Capstone Project

Getting the input data

Datasets and Inputs

Daily stock prices for the last five years will be extracted automatically from Yahoo! Finance for each stock (where available — some of the companies have not been listed that long). Stocks with less than three years of history will be excluded. If a stock is missing, its data will not be fetched from other sources, to keep things simple. After the automatic download, the data will be stored in .csv files, one per stock, both as a backup and so the information can be re-read later.

The reason for this is that Yahoo! Finance has become less reliable since the acquisition by Verizon in 2016, with less consistent data, download problems, and the closure of its API service. Other sources such as Google, Quandl, Alpha Vantage, Nordnet, Avanza, and some other Swedish alternatives have been considered, but turned out to be even worse.

The data will include the date, the opening and closing prices for each day, the high and low, the volume, and the adjusted closing price. The latter accounts for changes in the stock price due to both splits and dividends, while the other prices are only adjusted for splits. It is crucial that the input data accounts for splits, because a stock's price can be reduced (or increased) tenfold through a split without any effect on the company's value. Running an algorithm on unadjusted data would likely produce worthless results.
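As a hypothetical illustration of why split adjustment matters (the prices, dates, and split ratio below are made up, not from the dataset):

```python
import pandas as pd

# Toy example: a 10:1 split between 2017-01-03 and 2017-01-04 makes the raw
# close drop from ~200 to ~20 although the company's value is unchanged.
raw_close = pd.Series(
    [198.0, 200.0, 20.1, 20.3],
    index=pd.to_datetime(['2017-01-02', '2017-01-03', '2017-01-04', '2017-01-05']))

# Split-adjusted close: scale the pre-split prices by the split ratio so the
# whole series is expressed in post-split terms and comparable across the split.
split_factor = pd.Series([0.1, 0.1, 1.0, 1.0], index=raw_close.index)
adj_close = raw_close * split_factor

# The adjusted series moves smoothly (19.8, 20.0, 20.1, 20.3), while the raw
# series shows a spurious 90% "crash" that would mislead any algorithm.
```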

In [5]:
####### For reproducible results
#from numpy.random import seed
#seed(1)
#from tensorflow import set_random_seed
#set_random_seed(2)
#######

import pandas as pd
import copy
import datetime
from pandas_datareader import data
import time
from retrying import retry
import matplotlib.pyplot as plt
from matplotlib.ticker import MaxNLocator, IndexFormatter
import numpy as np
import math
import operator

##### Introduce some useful functions ####

# Yahoo! Finance isn't that reliable and might throw a RemoteDataError if we try to get the data too fast. 
# For this reason, add a @retry decorator to minimize the risk of RemoteDataError: 10 retries, with a
# one-second pause between attempts.
from pandas_datareader._utils import RemoteDataError  # the exception raised on failed downloads

@retry(stop_max_attempt_number=10)
def get_stock_data(ticker, dates):
    """Download daily data for ticker (start and end are defined globally below)."""
    try:
        stock = data.get_data_yahoo(ticker, start, end)
        stock.sort_index(ascending=False, inplace=True)  # store newest dates first
        return stock
    except RemoteDataError:
        print('Trying again..')
        time.sleep(1)
        raise  # re-raise so that @retry actually retries
        
def create_file_path(ticker):
    """Create a file path to store file(s)"""
    base = '/Users/jakob/Desktop/Programming/Udacity Machine Learning Nano Degree/Capstone Project/Data_Capstone/'
    return(base + ticker + '.csv')   

def fill_missing_values(data_df):
    """Fill missing values forward, then backwards"""
    data_df.fillna(method="ffill", inplace=True)
    data_df.fillna(method="bfill", inplace=True)
    
In [6]:
start = datetime.datetime(2012,12,27)
end = datetime.date.today()
dates = pd.date_range(start, end)

stock_names = ['OMX Stockholm 30', 'Acando B', 'Addnode Group B', 'Addtech B', 'Africa Oil', 
              'AQ Group', 'Arcam', 'Beijer Alma', 'Beijer Ref', 'BioGaia B', 'Biotage', 
              'BlackPearl R. Inc.','Bulten', 'Bure Equity','Byggmax','Catella A','Catella B', 
              'Catena', 'Cavotec SA','CellaVision', 'Clas Ohlson B', 'Cloetta B', 
              'Concentric', 'Creades A', 'Diös Fastigheter', 'Duni', 'Elanders B', 
              'EnQuest PLC', 'Fagerhult', 'Fast Partner','G5 Entertainment', 'Gunnebo', 
              'Haldex', 'Hansa Medical', 'Heba B', 'HiQ International', 'HMS Networks', 
              'IAR Systems', 'INVISIO Communications', 'Kabe B', 'KappAhl', 'Karo Pharma',
              'Knowit','Lindab International','Lucara Diamond Corp.','Medivir B','Mekonomen', 
              'Midsona A', 'Midsona B', 'Mycronic', 'Nederman Holding', 'Net Insight B', 
              'New Wave B', 'Nolato B', 'OEM International B', 'Opus Group','Orexo', 'Probi', 
              'Qliro Group', 'RaySearch Laboratories B', 'Rezidor Hotel Group', 'SAS', 
              'Semafo', 'SkiStar B', 'Starbreeze B','Swedol B','Systemair','Tethys Oil', 
              'Traction B', 'VBG Group B','Vitrolife','Xvivo Perfusion','Öresund Investment']

tickers = ['^OMX', 'ACAN-B.ST', 'ANOD-B.ST', 'ADDT-B.ST', 'AOI.ST', 'AQ.ST', 'ARCM.ST', 
           'BEIA-B.ST', 'BEIJ-B.ST', 'BIOG-B.ST', 'BIOT.ST', 'PXXS-SDB.ST','BULTEN.ST',
           'BURE.ST', 'BMAX.ST', 'CAT-A.ST', 'CAT-B.ST', 'CATE.ST', 'CCC.ST', 'CEVI.ST', 
           'CLAS-B.ST', 'CLA-B.ST', 'COIC.ST', 'CRED-A.ST', 'DIOS.ST', 'DUNI.ST', 'ELAN-B.ST', 
           'ENQ.ST', 'FAG.ST', 'FPAR.ST', 'G5EN.ST', 'GUNN.ST', 'HLDX.ST', 'HMED.ST', 
           'HEBA-B.ST', 'HIQ.ST', 'HMS.ST', 'IAR-B.ST', 'IVSO.ST', 'KABE-B.ST', 'KAHL.ST', 
           'KARO.ST', 'KNOW.ST', 'LIAB.ST', 'LUC.ST', 'MVIR-B.ST','MEKO.ST', 'MSON-A.ST', 
           'MSON-B.ST', 'MYCR.ST', 'NMAN.ST', 'NETI-B.ST', 'NEWA-B.ST','NOLA-B.ST','OEM-B.ST', 
           'OPUS.ST', 'ORX.ST', 'PROB.ST', 'QLRO.ST', 'RAY-B.ST', 'REZT.ST', 'SAS.ST',
           'SMF.ST', 'SKIS-B.ST', 'STAR-B.ST', 'SWOL-B.ST', 'SYSR.ST', 'TETY.ST', 'TRAC-B.ST', 
           'VBG-B.ST', 'VITR.ST', 'XVIVO.ST', 'ORES.ST']

Download and save the data into .csv files

Many of the stocks have missing values. To fix this, pandas' built-in forward-fill and backward-fill methods are used, as defined in the function fill_missing_values() above. This is a common approach when dealing with incomplete time series data.
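A minimal sketch of this fill strategy on a toy series (same order of operations as in fill_missing_values() above):

```python
import numpy as np
import pandas as pd

# Forward fill first: each gap is filled with the last known value.
# Backward fill second: any NaNs left at the very start of the series
# (which have no earlier value to copy) are filled from the first known value.
s = pd.Series([np.nan, 10.0, np.nan, np.nan, 12.0, np.nan])
filled = s.ffill().bfill()
# filled is now [10.0, 10.0, 10.0, 10.0, 12.0, 12.0]
```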

In [ ]:
### Download and save the data into .csv files ###

#time1 = time.time()    
      
#for i in tickers:
#    stock_df = get_stock_data(i, dates)
    
#    # Fill missing values forward, then, fill backward
#    fill_missing_values(stock_df)
    
#    # Save the files as .csv as well
#    stock_df.to_csv(create_file_path(i)) 
    
    
#print("Total time to download the data: {0:0.0f} s".format(time.time() - time1))
In [ ]:
#display(stock_df.head())

Import the data from the .csv files (if they already exist).

What's happening below? First, the ticker names are collected from the .csv file names into a list called file_names. Then each .csv file is loaded into a dataframe and stored in the list loaded_stocks, so each position in that list holds all the data for one stock. The ticker name is also embedded in each stock dataframe's index name.

Set the index column to the Date column when using .read_csv; this will simplify normalizing the data later on.

In [7]:
import os
from glob import glob


# Count the number of files in the input data directory
directory = '/Users/jakob/Desktop/Programming/Udacity Machine Learning Nano Degree/Capstone Project/Data_Capstone/'
file_paths = glob(directory+"*.csv")  # Get each .csv file in the directory

# Get all the file names
file_names = []
for root, dirs, files in os.walk(directory):  
    for filename in files:
        filename = filename[:-4]   # Just keep the ticker name, without the .csv file extension
        file_names.append(filename)
del file_names[0]   # drop the first entry, which is not a stock .csv (presumably a hidden system file)
        
# Define a common index for all dataframes
m = pd.read_csv(file_paths[0], index_col='Date')
glob_index = m.index

# Get the input data from the .csv files
loaded_stocks = []
for i in range(len(file_paths)):
    stock = pd.read_csv(file_paths[i], index_col='Date')
    stock['Volatility'] = (stock['High'] - stock['Low']) / stock['Open']  # Calculate the volatility
    stock.index.names = [file_names[i] + '__' + 'Date']                   # Change the index name to stock name + Date
    loaded_stocks.append(stock)

dim = loaded_stocks[1].shape    
print("Total amount of input data points: {0}".format(dim[0] * dim[1] * len(loaded_stocks)))
print("Number of stocks: ", len(loaded_stocks))
#print(this_stock.to_string())   # print the entire dataframe
display(loaded_stocks[1].head())


#print(loaded_stocks[1].index.name[:-6])
Total amount of input data points: 665322
Number of stocks:  73
Open High Low Close Adj Close Volume Volatility
ACAN-B.ST__Date
2018-03-02 28.549999 28.950001 28.150000 28.700001 28.700001 122849 0.028021
2018-03-01 29.549999 29.600000 29.000000 29.000000 29.000000 120838 0.020305
2018-02-28 29.799999 29.799999 29.500000 29.650000 29.650000 60613 0.010067
2018-02-27 29.700001 30.000000 29.700001 29.850000 29.850000 53990 0.010101
2018-02-26 29.500000 30.000000 29.500000 29.650000 29.650000 71179 0.016949

Normalize the stock data according to the first date (2012-12-27) in each stock dataframe.

Display the top and bottom five values, both unchanged and normalized.

In [8]:
def normalize_data(prices):
    """ Normalize data stored in prices"""
    if isinstance(prices, pd.DataFrame): # Check if Dataframe
        prices = prices/prices.iloc[-1]  # normalize by the first date's value (which is now at the end of the df)
    else:                                # if array
        prices = prices/prices[-1]     
    return prices  

# Normalize all the stock prices
#col = ['Open', 'High', 'Low', 'Close', 'Adj Close']
norm_stock_prices = []
for i in loaded_stocks:
    norm_d = normalize_data(i)
    norm_d[norm_d == np.inf] = 0          # if any value in norm_d is infinite, set it to 0
    fill_missing_values(norm_d)
    norm_d = norm_d[~norm_d.index.duplicated(keep='last')]  # Remove duplicated indices (if any)    
    norm_stock_prices.append(norm_d)

#display(loaded_stocks[1].head(5), loaded_stocks[1].iloc[-5:, :])
display(norm_stock_prices[1].head(), norm_stock_prices[1].iloc[-5:, :])

print(len(norm_stock_prices[1]))
Open High Low Close Adj Close Volume Volatility
ACAN-B.ST__Date
2018-03-02 2.130597 1.942953 2.100746 1.926175 2.631674 2.732224 0.250321
2018-03-01 2.205224 1.986577 2.164179 1.946309 2.659183 2.687499 0.181387
2018-02-28 2.223881 2.000000 2.201493 1.989933 2.718785 1.348064 0.089933
2018-02-27 2.216418 2.013423 2.216418 2.003356 2.737124 1.200765 0.090235
2018-02-26 2.201493 2.013423 2.201493 1.989933 2.718785 1.583057 0.151412
Open High Low Close Adj Close Volume Volatility
ACAN-B.ST__Date
2013-01-04 1.104478 1.006711 1.074627 1.000000 1.000000 0.095434 0.362162
2013-01-03 1.104478 1.006711 1.100746 1.006711 1.006711 0.042079 0.150901
2013-01-02 1.149254 1.033557 1.037313 1.006711 1.006711 0.455063 0.870130
2012-12-28 1.082090 1.033557 1.044776 1.033557 1.033557 3.355092 0.862529
2012-12-27 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000 1.000000
1302
In [9]:
#for tick in tickers:
#    display(get_stock(norm_stock_prices, tick).head())
In [10]:
# Define some more useful functions.

def get_stock(stock_list, ticker):
    """ Returns the dataframe containing the stock with the ticker symbol ticker. 
        stock_list is a list of stock data stored in dataframes. """
    for sd in stock_list:
        if sd.index.name[:-6] == ticker:
            return sd            

def get_ticker(DataFrame):
    """Return the ticker symbol for the stock in DataFrame"""
    return DataFrame.index.name[:-6]
  

# Just check so that it works as intended
starbreeze_df = get_stock(loaded_stocks, 'STAR-B.ST')
display(starbreeze_df.head()) 
Open High Low Close Adj Close Volume Volatility
STAR-B.ST__Date
2018-03-02 9.940 10.18 9.895 9.960 9.960 131028.0 0.028672
2018-03-01 10.500 10.60 9.955 9.955 9.955 791073.0 0.061429
2018-02-28 9.705 10.59 9.705 10.490 10.490 1284906.0 0.091190
2018-02-27 9.660 9.91 9.600 9.895 9.895 987261.0 0.032091
2018-02-26 9.640 9.79 9.605 9.655 9.655 468709.0 0.019191

Define functions for plotting the dataframes. Better and easier solutions may exist, but quite a few adjustments are needed to get acceptable plots.

In [11]:
from mpl_toolkits.axes_grid1.inset_locator import zoomed_inset_axes
from mpl_toolkits.axes_grid1.inset_locator import mark_inset


# Define a function for plotting a dataframe
def plot_it(data_df, title='', xlabel='', ylabel='', legend=''):
    """Plot the stock stored in data_df, title is the plot title"""
    plt.figure(figsize=(10, 5))
    pl = data_df.loc[::-1].plot(fontsize=12, figsize=(13, 5))
    
    pl.set_title(label=title, fontsize=20)
    pl.set_xlabel(xlabel, fontsize=15) 
    plt.autoscale(enable=True, axis='x', tight=True)    
    pl.set_ylabel(ylabel, fontsize=15)
    plt.legend(fontsize=12, loc='upper left') # [legend],
    plt.grid(axis='both', alpha=.5)
    pl.xaxis.set_major_locator(MaxNLocator(12))
    pl.xaxis.set_major_formatter(IndexFormatter(data_df.index[::-1]))
    plt.xticks(rotation=50, horizontalalignment='center', rotation_mode='default')
    plt.show()


# Define a function for plotting two dataframes in different colours in the same plot
def plot_2it(data_df1, data_df2, label1='', label2='', title=''):
    """Plot the stock data stored in data_df1 and data_df2 in different colors. 
        label1 and label2 are the label names while title is the plot title"""
    plt.figure(figsize=(10, 5))
    pl = data_df1.loc[::-1].plot(fontsize=12, figsize=(13, 5), label=label1, color='green') 
    plt.plot([None for i in data_df1.loc[::-1]] + [x for x in data_df2.loc[::-1]], label=label2, color='royalblue')
    
    pl.set_title(label=title, fontsize=20)
    pl.set_xlabel('Date', fontsize=15) 
    plt.autoscale(enable=True, axis='x', tight=True)    
    pl.set_ylabel('Price', fontsize=15)
    plt.legend(fontsize=12, loc='upper left')
    plt.grid(axis='both', alpha=.5)
    pl.xaxis.set_major_locator(MaxNLocator(12))
    temp = pd.concat([data_df2, data_df1], axis=1)
    pl.xaxis.set_major_formatter(IndexFormatter(temp.index))
    plt.xticks(rotation=50, horizontalalignment='center', rotation_mode='default')
    plt.show()   

    
# Define a function for plotting three dataframes in different colours in the same plot
def plot_3it(data_df1, data_df2, data_df3, label1='', label2='', label3='', title=''):
    """Plot the train set (data_df1), the test set (data_df2) and the entire dataset (data_df3)
        in different colors. label1, label2 and label3 are the label names while title is the plot title"""
    plt.figure(figsize=(10, 5))
    pl = data_df1.loc[::-1].plot(fontsize=12, figsize=(13, 5), label=label1, color='green') 
    plt.plot([None for i in data_df1.loc[::-1]] + [x for x in data_df2.loc[::-1]], label=label2, color='royalblue')
    plt.plot(data_df3.loc[::-1], label=label3, color='darkorange')
    
    pl.set_title(label=title, fontsize=20)
    pl.set_xlabel('Date', fontsize=15) 
    plt.autoscale(enable=True, axis='x', tight=True)    
    pl.set_ylabel('Price', fontsize=15)
    plt.legend(fontsize=12, loc='upper left')
    plt.grid(axis='both', alpha=.5)
    pl.xaxis.set_major_locator(MaxNLocator(12))
    temp = pd.concat([data_df2, data_df1], axis=1)
    pl.xaxis.set_major_formatter(IndexFormatter(temp.index))
    plt.xticks(rotation=50, horizontalalignment='center', rotation_mode='default')
    plt.show()   
In [12]:
# Plot the stocks and compare every single one with the OMX Stockholm 30 index
count = 0
for stock in norm_stock_prices[1:]:
    temp_df = pd.concat([stock.loc[:, 'Adj Close'], norm_stock_prices[0].loc[:, 'Adj Close']],
                       keys=[get_ticker(stock), get_ticker(norm_stock_prices[0])], axis=1)
    fill_missing_values(temp_df)
    concat_df = temp_df.set_index(glob_index) # Set index equal to OMX's index   
    if concat_df.iloc[-1,0] != 1.0:   # If the dataframe is inverted, correct it.
        concat_df = concat_df[::-1]
        #concat_df = concat_df.reverse()   # reverse() can also be used
        concat_df.index = concat_df.index[::-1]
        
    #if count <= 10:
    #    plot_it(concat_df, xlabel='Date', ylabel='Price')
    count += 1
    

They all seem to be correct. So, we have successfully downloaded, imported, normalized, filled missing values in, and plotted data for 73 different stocks, including OMX Stockholm 30.

MinMaxScale and Train-Test Split

Apply the MinMaxScaler, do a train-test split, and plot the result for the first 10 stocks, with different colors for the training and test sets.
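A minimal sketch (on synthetic data) of the chronological split used below: because each dataframe is stored newest-first, the first 20% of the rows (the most recent dates) become the test set and the remainder the training set.

```python
import pandas as pd

# Toy newest-first series: index 0 holds the most recent day.
df = pd.DataFrame({'Adj Close': range(10, 0, -1)})

test_size = int(len(df) * 0.20)                # 20% of the rows
test, train = df[:test_size], df[test_size:]   # newest rows -> test, older rows -> train
```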

In [13]:
# Create a copy to keep scaled and normalized data apart. [Have to use copy.deepcopy()]

scaled_LOG_stock_prices = copy.deepcopy(norm_stock_prices)
In [14]:
from sklearn.preprocessing import MinMaxScaler

##### NOTE: in order to use fit_transform, the input data must be at most 2D


# We have to specify one scaler per column: each column has different min and max values, so the
# MinMaxScaler will be tuned slightly differently for each of them. This is needed to invert
# the scaling correctly later on.
# Create a MinMaxScaler for each column in each stock and store it in the dictionary many_MinMaxScalers.
# The key is the stock ticker + the column number.
# [0, 1, 2, 3, 4, 5, 6] <=> ['Open', 'High', 'Low', 'Close', 'Adj Close', 'Volume', 'Volatility']

#Specify one scaler for each column and stock
many_MinMaxScalers = {}
for i_s in range(len(norm_stock_prices)):
    for j_s in range(7):
        many_MinMaxScalers["{0}".format(get_ticker(norm_stock_prices[i_s])+str(j_s))] = MinMaxScaler(feature_range=(0,1))
In [15]:
print(len(many_MinMaxScalers))
511
In [16]:
def MMscale_data(data):
    """ A function for scaling the data in the dataframe data"""
    for i in range(len(data.columns)):
        data.iloc[:,i] = many_MinMaxScalers[get_ticker(data)+str(i)].fit_transform(data.iloc[:,i].values.reshape(-1,1))
    return data


def Un_scale_data(data, ticker=' '):
    """ A function for unscaling the data in the variable data.
        No ticker is needed if data is a dataframe while it is needed if data is an array. """
    if ticker == ' ' or isinstance(data, pd.DataFrame):  # if dataframe
        data.iloc[:,] = many_MinMaxScalers[get_ticker(data)+str(4)].inverse_transform(data.iloc[:,].values.reshape(-1,1))   
    else: # is an array
        data = many_MinMaxScalers[ticker+str(4)].inverse_transform(data.reshape(-1,1))
    return data


def Un_scale_data_whole(data):
    """ A function for unscaling the data in the dataframe data"""
    for c in range(len(data.columns)):
        data.iloc[:,c] = many_MinMaxScalers[get_ticker(data)+str(c)].inverse_transform(data.iloc[:,c].values.reshape(-1,1))
    return data
In [17]:
LOG_norm_train_list, LOG_norm_test_list = [], []
LOG_scaled_train_list, LOG_scaled_test_list = [], []


# Plot the first 10 stocks, OMX 30 excluded
for i in range(1, len(scaled_LOG_stock_prices)):
    test_size = int(len(scaled_LOG_stock_prices[i]) * 0.20)     # Specify the test size
    MMscale_data(scaled_LOG_stock_prices[i])
    
    # Scaled and normalized
    LOG_train, LOG_test = scaled_LOG_stock_prices[i][test_size:], scaled_LOG_stock_prices[i][0:test_size]
    # save each one into a list
    LOG_scaled_train_list.append(LOG_train)
    LOG_scaled_test_list.append(LOG_test) 
    
    # Normalized
    LOG_norm_train, LOG_norm_test = norm_stock_prices[i][test_size:], norm_stock_prices[i][0:test_size]
    LOG_norm_train_list.append(LOG_norm_train)
    LOG_norm_test_list.append(LOG_norm_test)   
    #if i <= 10:
    #    plot_2it(LOG_norm_train.loc[:, 'Adj Close'], LOG_norm_test.loc[:, 'Adj Close'], 
    #             'Training set', 'Test set', get_ticker(norm_stock_prices[i]))
    
    
print("Training samples: {0}".format(len(LOG_train)))
print("Testing samples: {0}".format(len(LOG_test)))
Training samples: 1042
Testing samples: 260

Make combination plots

In [2]:
#for k in range(len(LOG_norm_train_list)):
#    plot_3it(LOG_norm_train_list[k].loc[:, 'Adj Close'], LOG_norm_test_list[k].loc[:, 'Adj Close'], 
#             norm_stock_prices[0].loc[:, 'Adj Close'], 'Training set', 'Test set', 'OMX30', 
#             title = get_ticker(LOG_norm_train_list[k]))
        

Display the intersection between the training and test set

In [15]:
display(LOG_norm_test_list[0].iloc[-3:, :], LOG_norm_train_list[0].iloc[:3, :])
Open High Low Close Adj Close Volume Volatility
ACAN-B.ST__Date
2017-02-27 2.425373 2.214765 2.365672 2.161074 2.833405 8.054200 0.357333
2017-02-24 2.380597 2.167785 2.268657 2.154362 2.824606 11.125837 0.532079
2017-02-23 2.455224 2.208054 2.246269 2.140940 2.807007 10.949381 0.760284
Open High Low Close Adj Close Volume Volatility
ACAN-B.ST__Date
2017-02-22 2.477612 2.228188 2.432836 2.194631 2.877402 5.158152 0.161447
2017-02-21 2.470149 2.241611 2.462687 2.228188 2.921399 4.770522 0.107956
2017-02-20 2.462687 2.221476 2.432836 2.214765 2.903800 6.160910 0.135354

Logistic Regression Model

In [16]:
from keras.models import Sequential
from keras.layers import Dense, Dropout, Activation, LSTM, LeakyReLU
from keras import optimizers
from keras import regularizers
np.set_printoptions(threshold=1000)


""" Define and compile a simple logistic regression model """
def logistic_regression_model(output_size, neurons, activ_func='relu', 
                              optimizer='adam', loss='mean_squared_error'):
    model = Sequential()
    model.add(Dense(output_size, activation=activ_func, input_shape=(7,)))
    #model.add(Dropout(dropout))   ### Bättre resultat utan Dropout (innan)

    model.compile(optimizer = optimizer, loss = loss)   #, metrics=['accuracy'])
    model.summary()
    return model
Using TensorFlow backend.

Scaling the data

Our training inputs and outputs each consist of 1041 rows and seven columns, while the test inputs and outputs each consist of 259 rows and seven columns. Although all seven columns are kept as targets here, we are really only interested in predicting the adjusted closing price of the stock.
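The one-step-ahead setup created in the next cell can be sketched on synthetic data like this (the column name and values are illustrative only):

```python
import pandas as pd

# Toy chronological series (oldest first, as after the [::-1] reversal below).
prices = pd.DataFrame({'Adj Close': [1.00, 1.10, 1.20, 1.15]})

# Inputs: every day except the last (the last day has no "next day" to predict).
inputs = prices.iloc[:-1]
# Targets: the same series shifted one day ahead, so day t is paired with day t+1.
targets = prices.iloc[1:].reset_index(drop=True)
```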

In [17]:
# Create the datasets: training and testing inputs as well as outputs.
LOG_train_inputs = copy.deepcopy(LOG_scaled_train_list[0][::-1][:-1])         # all days except the last
LOG_train_outputs = copy.deepcopy(LOG_scaled_train_list[0][::-1].iloc[1:, :]) # shifted one day ahead, so each
                                                                              # day is paired with the next day's values
LOG_test_inputs = copy.deepcopy(LOG_scaled_test_list[0][::-1][:-1])
LOG_test_outputs = copy.deepcopy(LOG_scaled_test_list[0][::-1].iloc[1:, :])

print(LOG_train_inputs.shape)
print(LOG_train_outputs.shape)
print()
print(LOG_test_inputs.shape)
print(LOG_test_outputs.shape)
(1041, 7)
(1041, 7)

(259, 7)
(259, 7)
In [18]:
# Random seed for reproducibility
np.random.seed(45)

# Build the model architecture
LOG_model = logistic_regression_model(output_size=7, neurons=30)
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
dense_1 (Dense)              (None, 7)                 56        
=================================================================
Total params: 56
Trainable params: 56
Non-trainable params: 0
_________________________________________________________________

Training the model

In [19]:
# Train the model
trained_model = LOG_model.fit(LOG_train_inputs, LOG_train_outputs, epochs=20, batch_size=1, 
                              verbose=2, shuffle=True, validation_split=0.05) 
Train on 988 samples, validate on 53 samples
Epoch 1/20
 - 2s - loss: 0.0159 - val_loss: 0.0995
Epoch 2/20
 - 1s - loss: 0.0105 - val_loss: 0.0958
Epoch 3/20
 - 1s - loss: 0.0103 - val_loss: 0.0961
Epoch 4/20
 - 2s - loss: 0.0102 - val_loss: 0.0945
Epoch 5/20
 - 2s - loss: 0.0101 - val_loss: 0.0947
Epoch 6/20
 - 2s - loss: 0.0101 - val_loss: 0.0942
Epoch 7/20
 - 2s - loss: 0.0100 - val_loss: 0.0938
Epoch 8/20
 - 2s - loss: 0.0100 - val_loss: 0.0947
Epoch 9/20
 - 2s - loss: 0.0100 - val_loss: 0.0940
Epoch 10/20
 - 2s - loss: 0.0100 - val_loss: 0.0950
Epoch 11/20
 - 2s - loss: 0.0100 - val_loss: 0.0957
Epoch 12/20
 - 2s - loss: 0.0100 - val_loss: 0.0948
Epoch 13/20
 - 2s - loss: 0.0100 - val_loss: 0.0944
Epoch 14/20
 - 2s - loss: 0.0100 - val_loss: 0.0943
Epoch 15/20
 - 1s - loss: 0.0100 - val_loss: 0.0948
Epoch 16/20
 - 2s - loss: 0.0100 - val_loss: 0.0942
Epoch 17/20
 - 1s - loss: 0.0100 - val_loss: 0.0944
Epoch 18/20
 - 2s - loss: 0.0100 - val_loss: 0.0940
Epoch 19/20
 - 2s - loss: 0.0100 - val_loss: 0.0936
Epoch 20/20
 - 2s - loss: 0.0100 - val_loss: 0.0953

Plot the training error

We would expect this to decrease over time.

In [20]:
def plot_error(model):
    """ Plot the error and some statistics. """
    fig, ax1 = plt.subplots(1,1, figsize=(10, 5))
    ax1.plot(model.epoch, model.history['loss'])
    ax1.set_title('Training Error')
    ax1.set_ylabel('Loss',fontsize=12)
    ax1.set_xlabel('# Epochs',fontsize=12)
    plt.show()

# Plot the error
plot_error(trained_model)

trainScore = LOG_model.evaluate(LOG_train_inputs, LOG_train_outputs, verbose=0)
testScore = LOG_model.evaluate(LOG_test_inputs, LOG_test_outputs, verbose=0)
print("Mean Squared Error on the training data: {0:0.5f}".format(trainScore))
print("Mean Squared Error on the test data:     {0:0.5f}".format(testScore))
Mean Squared Error on the training data: 0.01434
Mean Squared Error on the test data:     0.11133

Performance on the training and test sets

Now, check how our model performs on the training and test sets by plotting the real and predicted values and comparing them. A zoomed-in plot is also used; the idea is taken from this source: http://akuederle.com/matplotlib-zoomed-up-inset

In [21]:
from mpl_toolkits.axes_grid1.inset_locator import zoomed_inset_axes
from mpl_toolkits.axes_grid1.inset_locator import mark_inset

# A function for plotting a dataframe with a zoomed in plot
def plot_zoom(data_df, title='', xlabel='', ylabel=''):
    """Plot the stock stored in data_df, title is the plot title"""
    plt.figure(figsize=(10, 5))
    pl = data_df.loc[::-1].plot(fontsize=12, figsize=(13, 5)) # legend=None,
    pl.set_title(label=title, fontsize=20)
    pl.set_xlabel(xlabel, fontsize=15) 
    plt.autoscale(enable=True, axis='x', tight=True)    
    pl.set_ylabel(ylabel, fontsize=15)
    plt.legend(fontsize=12, loc='upper left')
    plt.grid(axis='both', alpha=.5)
    pl.xaxis.set_major_locator(MaxNLocator(12))
    pl.xaxis.set_major_formatter(IndexFormatter(data_df.index[::-1]))
    plt.xticks(rotation=50, horizontalalignment='center', rotation_mode='default')
    
    # The zoomed in window
    lg = int(len(data_df)*0.1)
    axins = zoomed_inset_axes(pl, 2.5, loc=9)
    axins.plot(data_df.loc[::-1])
    x1, x2 = data_df.index[lg,], data_df.index[0]        # specify the limits
    y1 = data_df.loc[data_df.index[0]:data_df.index[lg],'True Values'].min()
    y2 = data_df.loc[data_df.index[0]:data_df.index[lg],'True Values'].max()
    axins.set_xlim(x1, x2), axins.set_ylim(y1, y2)       # apply the x-limits, apply the y-limits
    axins.set_facecolor('whitesmoke')
    axins.axis[:].set_visible(False)                     # Remove the 4 borders
    mark_inset(pl, axins, loc1=2, loc2=4, fc="none", ec="1.5") # Add some lines for the zoom effect
    plt.show()
    
    
# A function for plotting three dataframes and a zoomed in plot
def plot_3zoom(data_df1, data_df2, data_df3, title='', xlabel='', ylabel='', zoom=True):
    """data_df1 contains the train set, data_df2 contains the test set and data_df3 contains the entire dataset. 
       title is the plot title. If a zoomed in window is desired, set zoom to True"""
    line_w, line_zoom = 1.0, 1.5    # line width for the main and zoomed plot
    # Plot the predicted train and test data
    diff = len(data_df3)-len(data_df2)-len(data_df1)
    pl = data_df1.plot(color='orchid', fontsize=12, figsize=(16, 7), label=data_df1.columns[0], linewidth=line_w)
    pred = np.empty_like(data_df3)
    pred[:, :] = np.nan
    pred[len(data_df1)+diff:len(data_df3), :] = data_df2
    plt.plot(pred, color='darkorange', label=data_df2.columns[0], linewidth=line_w)
    
    # Plot the actual values
    plt.plot(data_df3, color='green', label=data_df3.columns[0], linewidth=line_w) 
    
    pl.set_title(label=title, fontsize=20)
    pl.set_xlabel(xlabel, fontsize=15) 
    plt.autoscale(enable=True, axis='x', tight=True)    
    pl.set_ylabel(ylabel, fontsize=15)
    plt.legend(fontsize=12, loc='upper left')
    plt.grid(axis='both', alpha=.5)
    pl.xaxis.set_major_locator(MaxNLocator(12))
    pl.xaxis.set_major_formatter(IndexFormatter(data_df3.index[::-1]))
    plt.xticks(rotation=50, horizontalalignment='center', rotation_mode='default')
    
    if zoom:
        ## The zoomed in window
        lg = int(len(data_df3)*0.1)
        axins = zoomed_inset_axes(pl, 2.5, loc=9)
        axins.plot(data_df1.iloc[::-1], color='orchid', linewidth=line_zoom)
        axins.plot(pred, color='darkorange', label=data_df2.columns[0], linewidth=line_zoom)
        axins.plot(data_df3.iloc[::-1], color='green', linewidth=line_zoom)
        x1, x2 = data_df1[::-1].index[-lg//2,], data_df2[::-1].index[lg//2]    # specify the limits
        
        # Check for the max and min y values in the actual values, within the x limits.
        yA1 = data_df3.loc[data_df3.index[test_size-lg//2]:data_df3.index[test_size+lg//2],'Actual Data'].min()
        yA2 = data_df3.loc[data_df3.index[test_size-lg//2]:data_df3.index[test_size+lg//2],'Actual Data'].max()
        
        # Check for the max and min y values in the train set, within the x limits.
        yTr1 = data_df1.iloc[-lg//2:, 0].min()
        yTr2 = data_df1.iloc[-lg//2:, 0].max()
        
        # Check for the max and min y values in the test set, within the x limits.
        yTe1 = data_df2.iloc[:lg//2, 0].min()
        yTe2 = data_df2.iloc[:lg//2, 0].max()
        
        ys = [yA1, yA2, yTr1, yTr2, yTe1, yTe2]
        ymax, ymin = max(ys), min(ys)                       # find the max and min values among the different y's
        axins.set_xlim(x1, x2), axins.set_ylim(ymin, ymax)         # apply the x-limits, apply the y-limits
        axins.set_facecolor('whitesmoke')
        axins.axis[:].set_visible(False)                           # Remove the 4 borders
        mark_inset(pl, axins, loc1=2, loc2=4, fc="none", ec="1.5") # Add some lines for the zoom effect
    plt.show()
In [22]:
df1 = pd.DataFrame(data=(np.transpose(LOG_model.predict(LOG_norm_train_list[0][:-1].values)))[0], 
                     index=LOG_train_inputs.index, columns=['Predictions on the Train set'])
df2 = pd.DataFrame(data=(np.transpose(LOG_model.predict(LOG_norm_test_list[0][:-1].values)))[0],
                  index=LOG_test_inputs.index, columns=['Predictions on the Test set'])
df3 = pd.DataFrame(data=norm_stock_prices[1].loc[:, 'Adj Close'])
df3.columns = ['Actual Data']
name = get_ticker(LOG_norm_train_list[0])
plot_3zoom(df1[::-1], df2[::-1], df3, title='Logistic Regression Performance on the Training and Test Sets, ' + name, 
           xlabel='Date', ylabel='Price', zoom=False)

In the figure above, the actual values are plotted in green, while the purple line represents the predictions on the training set and the orange line the predictions on the test set. The model seems to perform fairly poorly on both sets, predicting little more than the previous day's value.
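To quantify that impression, one could compare the model's test MSE with a naive persistence baseline that simply predicts yesterday's value. A sketch (the function name and toy data are illustrative, not from the notebook):

```python
import numpy as np

def persistence_mse(series):
    """MSE obtained by predicting each value with the previous one."""
    preds = series[:-1]    # yesterday's value used as ...
    actual = series[1:]    # ... the forecast for today
    return float(np.mean((actual - preds) ** 2))

# On the real (chronologically ordered) test prices, a model whose MSE is not
# clearly below this baseline has learned little beyond copying the last value.
toy = np.array([1.00, 1.10, 1.05, 1.20])
baseline = persistence_mse(toy)
```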

Plot a couple of Logistic Regression predictions

In [32]:
def plot_some_LOG_models():
    # LOG_scaled_train_list and LOG_scaled_test_list contain the scaled stock values.
    global_time = time.time()
    nbr = 1
    for i in range(1, len(LOG_scaled_train_list)):
        print('===================')
        print('Plot: {0} (out of {1})'.format(nbr, len(LOG_scaled_train_list)-1))
        print('===================')
        
        LOG_train_inputs = copy.deepcopy(LOG_scaled_train_list[i][::-1][:-1])       
        LOG_train_outputs = copy.deepcopy(LOG_scaled_train_list[i][::-1].iloc[1:, :])
        LOG_test_inputs = copy.deepcopy(LOG_scaled_test_list[i][::-1][:-1])
        LOG_test_outputs = copy.deepcopy(LOG_scaled_test_list[i][::-1].iloc[1:, :])
    
        # Random seed for reproducibility
        np.random.seed(45)
        # Build the model architecture
        LOG_model = logistic_regression_model(output_size=7, neurons=30)
        # Train the model
        trained_model = LOG_model.fit(LOG_train_inputs, LOG_train_outputs, epochs=20, 
                                      batch_size=1, verbose=2, shuffle=True, validation_split=0.05) 
        
        trainScore = LOG_model.evaluate(LOG_train_inputs, LOG_train_outputs, verbose=0)
        testScore = LOG_model.evaluate(LOG_test_inputs, LOG_test_outputs, verbose=0)
        print("Mean Squared Error on the training data: {0:0.5f}".format(trainScore))
        print("Mean Squared Error on the test data:     {0:0.5f}".format(testScore))
    
        df1 = pd.DataFrame(data=(np.transpose(LOG_model.predict(LOG_norm_train_list[i][:-1].values)))[0], 
                     index=LOG_train_inputs.index, columns=['Predictions on the Train set'])
        df2 = pd.DataFrame(data=(np.transpose(LOG_model.predict(LOG_norm_test_list[i][:-1].values)))[0],
                  index=LOG_test_inputs.index, columns=['Predictions on the Test set'])
        df3 = pd.DataFrame(data=norm_stock_prices[i+1].loc[:, 'Adj Close'])
        df3.columns = ['Actual Data']
        name = get_ticker(LOG_norm_train_list[i])
        plot_3zoom(df1[::-1], df2[::-1], df3, 
                   title='Logistic Regression Performance on the Training and Test Sets, ' + name, 
                   xlabel='Date', ylabel='Price', zoom=False)
        nbr += 1
        print('======================================================================================================')
        
    print('======================================================================================================')
    print('Total run time in seconds: {0:0.0f}'.format(time.time()-global_time))
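The input/output pairing in the loop above relies on a one-day shift: the inputs drop the last row and the outputs drop the first, so row t of the inputs is matched with row t+1 of the outputs. A minimal sketch with toy data (the dates and prices below are hypothetical, chosen only to make the alignment visible):

```python
import pandas as pd

# A tiny stand-in for one entry of LOG_scaled_train_list
prices = pd.DataFrame({'Adj Close': [10.0, 11.0, 12.0, 13.0]},
                      index=pd.date_range('2018-01-01', periods=4))

inputs = prices[:-1]           # days 1..3: today's value
outputs = prices.iloc[1:, :]   # days 2..4: tomorrow's value

print(len(inputs), len(outputs))              # both have 3 rows
print(inputs.iloc[0, 0], outputs.iloc[0, 0])  # 10.0 is paired with 11.0
```

This is why both slices end up one row shorter than the original frame, and why their lengths always match.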
    
In [33]:
plot_some_LOG_models()
===================
Plot: 1 (out of 71)
===================
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
dense_2 (Dense)              (None, 7)                 56        
=================================================================
Total params: 56
Trainable params: 56
Non-trainable params: 0
_________________________________________________________________
Train on 988 samples, validate on 53 samples
Epoch 1/20
 - 2s - loss: 0.0464 - val_loss: 0.0672
Epoch 2/20
 - 2s - loss: 0.0279 - val_loss: 0.0657
Epoch 3/20
 - 1s - loss: 0.0275 - val_loss: 0.0654
Epoch 4/20
 - 1s - loss: 0.0274 - val_loss: 0.0653
Epoch 5/20
 - 1s - loss: 0.0273 - val_loss: 0.0653
Epoch 6/20
 - 2s - loss: 0.0273 - val_loss: 0.0652
Epoch 7/20
 - 2s - loss: 0.0273 - val_loss: 0.0653
Epoch 8/20
 - 2s - loss: 0.0273 - val_loss: 0.0652
Epoch 9/20
 - 2s - loss: 0.0273 - val_loss: 0.0652
Epoch 10/20
 - 1s - loss: 0.0273 - val_loss: 0.0652
Epoch 11/20
 - 2s - loss: 0.0273 - val_loss: 0.0654
Epoch 12/20
 - 2s - loss: 0.0273 - val_loss: 0.0653
Epoch 13/20
 - 1s - loss: 0.0273 - val_loss: 0.0653
Epoch 14/20
 - 2s - loss: 0.0273 - val_loss: 0.0652
Epoch 15/20
 - 2s - loss: 0.0273 - val_loss: 0.0653
Epoch 16/20
 - 1s - loss: 0.0273 - val_loss: 0.0652
Epoch 17/20
 - 2s - loss: 0.0273 - val_loss: 0.0652
Epoch 18/20
 - 2s - loss: 0.0273 - val_loss: 0.0652
Epoch 19/20
 - 2s - loss: 0.0273 - val_loss: 0.0652
Epoch 20/20
 - 2s - loss: 0.0273 - val_loss: 0.0652
Mean Squared Error on the training data: 0.02922
Mean Squared Error on the test data:     0.09413
======================================================================================================
===================
Plot: 2 (out of 71)
===================
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
dense_3 (Dense)              (None, 7)                 56        
=================================================================
Total params: 56
Trainable params: 56
Non-trainable params: 0
_________________________________________________________________
Train on 988 samples, validate on 53 samples
Epoch 1/20
 - 2s - loss: 0.0111 - val_loss: 0.0032
Epoch 2/20
 - 2s - loss: 0.0027 - val_loss: 0.0021
Epoch 3/20
 - 2s - loss: 0.0025 - val_loss: 0.0021
Epoch 4/20
 - 2s - loss: 0.0024 - val_loss: 0.0021
Epoch 5/20
 - 2s - loss: 0.0024 - val_loss: 0.0022
Epoch 6/20
 - 2s - loss: 0.0024 - val_loss: 0.0021
Epoch 7/20
 - 2s - loss: 0.0024 - val_loss: 0.0026
Epoch 8/20
 - 2s - loss: 0.0024 - val_loss: 0.0020
Epoch 9/20
 - 2s - loss: 0.0024 - val_loss: 0.0019
Epoch 10/20
 - 2s - loss: 0.0024 - val_loss: 0.0020
Epoch 11/20
 - 1s - loss: 0.0024 - val_loss: 0.0020
Epoch 12/20
 - 2s - loss: 0.0024 - val_loss: 0.0021
Epoch 13/20
 - 2s - loss: 0.0024 - val_loss: 0.0021
Epoch 14/20
 - 2s - loss: 0.0024 - val_loss: 0.0023
Epoch 15/20
 - 2s - loss: 0.0024 - val_loss: 0.0021
Epoch 16/20
 - 2s - loss: 0.0024 - val_loss: 0.0020
Epoch 17/20
 - 2s - loss: 0.0023 - val_loss: 0.0019
Epoch 18/20
 - 2s - loss: 0.0023 - val_loss: 0.0019
Epoch 19/20
 - 2s - loss: 0.0023 - val_loss: 0.0019
Epoch 20/20
 - 2s - loss: 0.0023 - val_loss: 0.0020
Mean Squared Error on the training data: 0.00228
Mean Squared Error on the test data:     0.00440
======================================================================================================
===================
Plot: 3 (out of 71)
===================
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
dense_4 (Dense)              (None, 7)                 56        
=================================================================
Total params: 56
Trainable params: 56
Non-trainable params: 0
_________________________________________________________________
Train on 988 samples, validate on 53 samples
Epoch 1/20
 - 2s - loss: 0.0503 - val_loss: 0.0104
Epoch 2/20
 - 2s - loss: 0.0369 - val_loss: 0.0097
Epoch 3/20
 - 2s - loss: 0.0367 - val_loss: 0.0097
Epoch 4/20
 - 2s - loss: 0.0367 - val_loss: 0.0094
Epoch 5/20
 - 2s - loss: 0.0366 - val_loss: 0.0092
Epoch 6/20
 - 2s - loss: 0.0366 - val_loss: 0.0093
Epoch 7/20
 - 2s - loss: 0.0366 - val_loss: 0.0092
Epoch 8/20
 - 2s - loss: 0.0366 - val_loss: 0.0093
Epoch 9/20
 - 2s - loss: 0.0366 - val_loss: 0.0093
Epoch 10/20
 - 2s - loss: 0.0366 - val_loss: 0.0092
Epoch 11/20
 - 1s - loss: 0.0366 - val_loss: 0.0092
Epoch 12/20
 - 1s - loss: 0.0366 - val_loss: 0.0092
Epoch 13/20
 - 1s - loss: 0.0366 - val_loss: 0.0093
Epoch 14/20
 - 1s - loss: 0.0366 - val_loss: 0.0091
Epoch 15/20
 - 1s - loss: 0.0366 - val_loss: 0.0093
Epoch 16/20
 - 1s - loss: 0.0366 - val_loss: 0.0092
Epoch 17/20
 - 2s - loss: 0.0366 - val_loss: 0.0092
Epoch 18/20
 - 1s - loss: 0.0366 - val_loss: 0.0092
Epoch 19/20
 - 1s - loss: 0.0366 - val_loss: 0.0092
Epoch 20/20
 - 1s - loss: 0.0366 - val_loss: 0.0091
Mean Squared Error on the training data: 0.03515
Mean Squared Error on the test data:     0.00221
======================================================================================================
===================
Plot: 4 (out of 71)
===================
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
dense_5 (Dense)              (None, 7)                 56        
=================================================================
Total params: 56
Trainable params: 56
Non-trainable params: 0
_________________________________________________________________
Train on 988 samples, validate on 53 samples
Epoch 1/20
 - 2s - loss: 0.0436 - val_loss: 0.1980
Epoch 2/20
 - 2s - loss: 0.0394 - val_loss: 0.1973
Epoch 3/20
 - 1s - loss: 0.0393 - val_loss: 0.1972
Epoch 4/20
 - 1s - loss: 0.0393 - val_loss: 0.1977
Epoch 5/20
 - 1s - loss: 0.0393 - val_loss: 0.1979
Epoch 6/20
 - 1s - loss: 0.0393 - val_loss: 0.1973
Epoch 7/20
 - 1s - loss: 0.0393 - val_loss: 0.1977
Epoch 8/20
 - 1s - loss: 0.0393 - val_loss: 0.1977
Epoch 9/20
 - 1s - loss: 0.0393 - val_loss: 0.1972
Epoch 10/20
 - 1s - loss: 0.0393 - val_loss: 0.1972
Epoch 11/20
 - 1s - loss: 0.0393 - val_loss: 0.1974
Epoch 12/20
 - 1s - loss: 0.0393 - val_loss: 0.1974
Epoch 13/20
 - 1s - loss: 0.0393 - val_loss: 0.1972
Epoch 14/20
 - 1s - loss: 0.0393 - val_loss: 0.1974
Epoch 15/20
 - 2s - loss: 0.0393 - val_loss: 0.1972
Epoch 16/20
 - 1s - loss: 0.0393 - val_loss: 0.1973
Epoch 17/20
 - 1s - loss: 0.0393 - val_loss: 0.1971
Epoch 18/20
 - 1s - loss: 0.0393 - val_loss: 0.1972
Epoch 19/20
 - 1s - loss: 0.0393 - val_loss: 0.1973
Epoch 20/20
 - 1s - loss: 0.0393 - val_loss: 0.1973
Mean Squared Error on the training data: 0.04732
Mean Squared Error on the test data:     0.22329
======================================================================================================
===================
Plot: 5 (out of 71)
===================
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
dense_6 (Dense)              (None, 7)                 56        
=================================================================
Total params: 56
Trainable params: 56
Non-trainable params: 0
_________________________________________________________________
Train on 970 samples, validate on 52 samples
Epoch 1/20
 - 2s - loss: 0.0643 - val_loss: 0.2112
Epoch 2/20
 - 1s - loss: 0.0566 - val_loss: 0.1098
Epoch 3/20
 - 1s - loss: 0.0293 - val_loss: 0.1052
Epoch 4/20
 - 1s - loss: 0.0291 - val_loss: 0.1050
Epoch 5/20
 - 1s - loss: 0.0291 - val_loss: 0.1050
Epoch 6/20
 - 1s - loss: 0.0290 - val_loss: 0.1051
Epoch 7/20
 - 1s - loss: 0.0290 - val_loss: 0.1049
Epoch 8/20
 - 1s - loss: 0.0290 - val_loss: 0.1051
Epoch 9/20
 - 1s - loss: 0.0290 - val_loss: 0.1056
Epoch 10/20
 - 1s - loss: 0.0290 - val_loss: 0.1049
Epoch 11/20
 - 1s - loss: 0.0290 - val_loss: 0.1050
Epoch 12/20
 - 1s - loss: 0.0290 - val_loss: 0.1050
Epoch 13/20
 - 1s - loss: 0.0289 - val_loss: 0.1049
Epoch 14/20
 - 1s - loss: 0.0289 - val_loss: 0.1049
Epoch 15/20
 - 1s - loss: 0.0289 - val_loss: 0.1049
Epoch 16/20
 - 2s - loss: 0.0289 - val_loss: 0.1050
Epoch 17/20
 - 3s - loss: 0.0289 - val_loss: 0.1049
Epoch 18/20
 - 3s - loss: 0.0289 - val_loss: 0.1050
Epoch 19/20
 - 2s - loss: 0.0289 - val_loss: 0.1049
Epoch 20/20
 - 2s - loss: 0.0289 - val_loss: 0.1050
Mean Squared Error on the training data: 0.03281
Mean Squared Error on the test data:     0.11624
======================================================================================================
===================
Plot: 6 (out of 71)
===================
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
dense_7 (Dense)              (None, 7)                 56        
=================================================================
Total params: 56
Trainable params: 56
Non-trainable params: 0
_________________________________________________________________
Train on 988 samples, validate on 53 samples
Epoch 1/20
 - 3s - loss: 0.0427 - val_loss: 0.0849
Epoch 2/20
 - 2s - loss: 0.0291 - val_loss: 0.0820
Epoch 3/20
 - 2s - loss: 0.0286 - val_loss: 0.0815
Epoch 4/20
 - 2s - loss: 0.0285 - val_loss: 0.0814
Epoch 5/20
 - 1s - loss: 0.0284 - val_loss: 0.0813
Epoch 6/20
 - 2s - loss: 0.0283 - val_loss: 0.0823
Epoch 7/20
 - 2s - loss: 0.0283 - val_loss: 0.0811
Epoch 8/20
 - 1s - loss: 0.0283 - val_loss: 0.0811
Epoch 9/20
 - 1s - loss: 0.0283 - val_loss: 0.0810
Epoch 10/20
 - 1s - loss: 0.0283 - val_loss: 0.0811
Epoch 11/20
 - 1s - loss: 0.0283 - val_loss: 0.0810
Epoch 12/20
 - 1s - loss: 0.0283 - val_loss: 0.0813
Epoch 13/20
 - 1s - loss: 0.0283 - val_loss: 0.0812
Epoch 14/20
 - 1s - loss: 0.0282 - val_loss: 0.0813
Epoch 15/20
 - 2s - loss: 0.0282 - val_loss: 0.0810
Epoch 16/20
 - 2s - loss: 0.0282 - val_loss: 0.0810
Epoch 17/20
 - 1s - loss: 0.0282 - val_loss: 0.0810
Epoch 18/20
 - 1s - loss: 0.0282 - val_loss: 0.0814
Epoch 19/20
 - 1s - loss: 0.0282 - val_loss: 0.0810
Epoch 20/20
 - 1s - loss: 0.0282 - val_loss: 0.0811
Mean Squared Error on the training data: 0.03097
Mean Squared Error on the test data:     0.10949
======================================================================================================
===================
Plot: 7 (out of 71)
===================
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
dense_8 (Dense)              (None, 7)                 56        
=================================================================
Total params: 56
Trainable params: 56
Non-trainable params: 0
_________________________________________________________________
Train on 988 samples, validate on 53 samples
Epoch 1/20
 - 2s - loss: 0.0188 - val_loss: 0.0347
Epoch 2/20
 - 1s - loss: 0.0138 - val_loss: 0.0342
Epoch 3/20
 - 2s - loss: 0.0137 - val_loss: 0.0344
Epoch 4/20
 - 1s - loss: 0.0136 - val_loss: 0.0343
Epoch 5/20
 - 1s - loss: 0.0136 - val_loss: 0.0344
Epoch 6/20
 - 1s - loss: 0.0136 - val_loss: 0.0342
Epoch 7/20
 - 1s - loss: 0.0136 - val_loss: 0.0340
Epoch 8/20
 - 1s - loss: 0.0136 - val_loss: 0.0350
Epoch 9/20
 - 1s - loss: 0.0135 - val_loss: 0.0342
Epoch 10/20
 - 1s - loss: 0.0136 - val_loss: 0.0344
Epoch 11/20
 - 1s - loss: 0.0135 - val_loss: 0.0345
Epoch 12/20
 - 1s - loss: 0.0135 - val_loss: 0.0344
Epoch 13/20
 - 1s - loss: 0.0135 - val_loss: 0.0344
Epoch 14/20
 - 1s - loss: 0.0135 - val_loss: 0.0349
Epoch 15/20
 - 1s - loss: 0.0135 - val_loss: 0.0343
Epoch 16/20
 - 1s - loss: 0.0135 - val_loss: 0.0344
Epoch 17/20
 - 2s - loss: 0.0135 - val_loss: 0.0346
Epoch 18/20
 - 2s - loss: 0.0135 - val_loss: 0.0342
Epoch 19/20
 - 1s - loss: 0.0135 - val_loss: 0.0344
Epoch 20/20
 - 1s - loss: 0.0135 - val_loss: 0.0344
Mean Squared Error on the training data: 0.01458
Mean Squared Error on the test data:     0.07342
======================================================================================================
===================
Plot: 8 (out of 71)
===================
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
dense_9 (Dense)              (None, 7)                 56        
=================================================================
Total params: 56
Trainable params: 56
Non-trainable params: 0
_________________________________________________________________
Train on 988 samples, validate on 53 samples
Epoch 1/20
 - 2s - loss: 0.0367 - val_loss: 0.0881
Epoch 2/20
 - 1s - loss: 0.0210 - val_loss: 0.0849
Epoch 3/20
 - 1s - loss: 0.0208 - val_loss: 0.0849
Epoch 4/20
 - 1s - loss: 0.0207 - val_loss: 0.0850
Epoch 5/20
 - 1s - loss: 0.0206 - val_loss: 0.0850
Epoch 6/20
 - 1s - loss: 0.0206 - val_loss: 0.0848
Epoch 7/20
 - 1s - loss: 0.0206 - val_loss: 0.0847
Epoch 8/20
 - 2s - loss: 0.0205 - val_loss: 0.0846
Epoch 9/20
 - 2s - loss: 0.0206 - val_loss: 0.0848
Epoch 10/20
 - 2s - loss: 0.0205 - val_loss: 0.0847
Epoch 11/20
 - 1s - loss: 0.0205 - val_loss: 0.0851
Epoch 12/20
 - 1s - loss: 0.0205 - val_loss: 0.0848
Epoch 13/20
 - 1s - loss: 0.0205 - val_loss: 0.0847
Epoch 14/20
 - 1s - loss: 0.0205 - val_loss: 0.0852
Epoch 15/20
 - 1s - loss: 0.0205 - val_loss: 0.0848
Epoch 16/20
 - 1s - loss: 0.0205 - val_loss: 0.0847
Epoch 17/20
 - 1s - loss: 0.0205 - val_loss: 0.0848
Epoch 18/20
 - 1s - loss: 0.0205 - val_loss: 0.0852
Epoch 19/20
 - 1s - loss: 0.0205 - val_loss: 0.0848
Epoch 20/20
 - 1s - loss: 0.0205 - val_loss: 0.0849
Mean Squared Error on the training data: 0.02376
Mean Squared Error on the test data:     0.09379
======================================================================================================
===================
Plot: 9 (out of 71)
===================
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
dense_10 (Dense)             (None, 7)                 56        
=================================================================
Total params: 56
Trainable params: 56
Non-trainable params: 0
_________________________________________________________________
Train on 988 samples, validate on 53 samples
Epoch 1/20
 - 2s - loss: 0.0080 - val_loss: 0.0286
Epoch 2/20
 - 1s - loss: 0.0058 - val_loss: 0.0282
Epoch 3/20
 - 2s - loss: 0.0057 - val_loss: 0.0281
Epoch 4/20
 - 1s - loss: 0.0057 - val_loss: 0.0281
Epoch 5/20
 - 1s - loss: 0.0057 - val_loss: 0.0285
Epoch 6/20
 - 1s - loss: 0.0057 - val_loss: 0.0285
Epoch 7/20
 - 1s - loss: 0.0056 - val_loss: 0.0281
Epoch 8/20
 - 1s - loss: 0.0056 - val_loss: 0.0282
Epoch 9/20
 - 1s - loss: 0.0056 - val_loss: 0.0280
Epoch 10/20
 - 1s - loss: 0.0056 - val_loss: 0.0284
Epoch 11/20
 - 2s - loss: 0.0056 - val_loss: 0.0281
Epoch 12/20
 - 2s - loss: 0.0056 - val_loss: 0.0282
Epoch 13/20
 - 2s - loss: 0.0056 - val_loss: 0.0280
Epoch 14/20
 - 1s - loss: 0.0056 - val_loss: 0.0284
Epoch 15/20
 - 1s - loss: 0.0056 - val_loss: 0.0283
Epoch 16/20
 - 2s - loss: 0.0056 - val_loss: 0.0280
Epoch 17/20
 - 1s - loss: 0.0056 - val_loss: 0.0281
Epoch 18/20
 - 1s - loss: 0.0056 - val_loss: 0.0281
Epoch 19/20
 - 1s - loss: 0.0056 - val_loss: 0.0280
Epoch 20/20
 - 1s - loss: 0.0056 - val_loss: 0.0280
Mean Squared Error on the training data: 0.00674
Mean Squared Error on the test data:     0.07180
======================================================================================================
===================
Plot: 10 (out of 71)
===================
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
dense_11 (Dense)             (None, 7)                 56        
=================================================================
Total params: 56
Trainable params: 56
Non-trainable params: 0
_________________________________________________________________
Train on 988 samples, validate on 53 samples
Epoch 1/20
 - 2s - loss: 0.0680 - val_loss: 0.0595
Epoch 2/20
 - 1s - loss: 0.0431 - val_loss: 0.0590
Epoch 3/20
 - 1s - loss: 0.0427 - val_loss: 0.0592
Epoch 4/20
 - 1s - loss: 0.0426 - val_loss: 0.0590
Epoch 5/20
 - 1s - loss: 0.0426 - val_loss: 0.0592
Epoch 6/20
 - 1s - loss: 0.0425 - val_loss: 0.0589
Epoch 7/20
 - 1s - loss: 0.0425 - val_loss: 0.0589
Epoch 8/20
 - 1s - loss: 0.0425 - val_loss: 0.0589
Epoch 9/20
 - 1s - loss: 0.0425 - val_loss: 0.0588
Epoch 10/20
 - 2s - loss: 0.0425 - val_loss: 0.0592
Epoch 11/20
 - 1s - loss: 0.0425 - val_loss: 0.0591
Epoch 12/20
 - 1s - loss: 0.0425 - val_loss: 0.0592
Epoch 13/20
 - 1s - loss: 0.0425 - val_loss: 0.0586
Epoch 14/20
 - 1s - loss: 0.0425 - val_loss: 0.0588
Epoch 15/20
 - 1s - loss: 0.0425 - val_loss: 0.0589
Epoch 16/20
 - 1s - loss: 0.0424 - val_loss: 0.0586
Epoch 17/20
 - 1s - loss: 0.0425 - val_loss: 0.0585
Epoch 18/20
 - 1s - loss: 0.0425 - val_loss: 0.0585
Epoch 19/20
 - 1s - loss: 0.0425 - val_loss: 0.0586
Epoch 20/20
 - 1s - loss: 0.0424 - val_loss: 0.0587
Mean Squared Error on the training data: 0.04322
Mean Squared Error on the test data:     0.06085
======================================================================================================
===================
Plot: 11 (out of 71)
===================
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
dense_12 (Dense)             (None, 7)                 56        
=================================================================
Total params: 56
Trainable params: 56
Non-trainable params: 0
_________________________________________________________________
Train on 988 samples, validate on 53 samples
Epoch 1/20
 - 2s - loss: 0.0236 - val_loss: 0.0021
Epoch 2/20
 - 2s - loss: 0.0030 - val_loss: 0.0014
Epoch 3/20
 - 2s - loss: 0.0027 - val_loss: 0.0013
Epoch 4/20
 - 2s - loss: 0.0027 - val_loss: 0.0014
Epoch 5/20
 - 2s - loss: 0.0026 - val_loss: 0.0016
Epoch 6/20
 - 2s - loss: 0.0026 - val_loss: 0.0017
Epoch 7/20
 - 2s - loss: 0.0026 - val_loss: 0.0015
Epoch 8/20
 - 2s - loss: 0.0026 - val_loss: 0.0014
Epoch 9/20
 - 2s - loss: 0.0026 - val_loss: 0.0013
Epoch 10/20
 - 2s - loss: 0.0025 - val_loss: 0.0015
Epoch 11/20
 - 2s - loss: 0.0025 - val_loss: 0.0018
Epoch 12/20
 - 2s - loss: 0.0025 - val_loss: 0.0016
Epoch 13/20
 - 2s - loss: 0.0025 - val_loss: 0.0012
Epoch 14/20
 - 2s - loss: 0.0025 - val_loss: 0.0016
Epoch 15/20
 - 2s - loss: 0.0025 - val_loss: 0.0015
Epoch 16/20
 - 2s - loss: 0.0025 - val_loss: 0.0013
Epoch 17/20
 - 2s - loss: 0.0025 - val_loss: 0.0015
Epoch 18/20
 - 2s - loss: 0.0025 - val_loss: 0.0013
Epoch 19/20
 - 2s - loss: 0.0025 - val_loss: 0.0012
Epoch 20/20
 - 2s - loss: 0.0025 - val_loss: 0.0015
Mean Squared Error on the training data: 0.00245
Mean Squared Error on the test data:     0.00175
======================================================================================================
===================
Plot: 12 (out of 71)
===================
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
dense_13 (Dense)             (None, 7)                 56        
=================================================================
Total params: 56
Trainable params: 56
Non-trainable params: 0
_________________________________________________________________
Train on 988 samples, validate on 53 samples
Epoch 1/20
 - 2s - loss: 0.0355 - val_loss: 0.1070
Epoch 2/20
 - 2s - loss: 0.0210 - val_loss: 0.1060
Epoch 3/20
 - 2s - loss: 0.0209 - val_loss: 0.1066
Epoch 4/20
 - 2s - loss: 0.0208 - val_loss: 0.1067
Epoch 5/20
 - 1s - loss: 0.0208 - val_loss: 0.1061
Epoch 6/20
 - 2s - loss: 0.0207 - val_loss: 0.1074
Epoch 7/20
 - 2s - loss: 0.0207 - val_loss: 0.1067
Epoch 8/20
 - 2s - loss: 0.0207 - val_loss: 0.1072
Epoch 9/20
 - 2s - loss: 0.0207 - val_loss: 0.1063
Epoch 10/20
 - 1s - loss: 0.0207 - val_loss: 0.1063
Epoch 11/20
 - 2s - loss: 0.0207 - val_loss: 0.1074
Epoch 12/20
 - 2s - loss: 0.0207 - val_loss: 0.1062
Epoch 13/20
 - 1s - loss: 0.0207 - val_loss: 0.1057
Epoch 14/20
 - 2s - loss: 0.0207 - val_loss: 0.1068
Epoch 15/20
 - 2s - loss: 0.0207 - val_loss: 0.1071
Epoch 16/20
 - 1s - loss: 0.0207 - val_loss: 0.1059
Epoch 17/20
 - 2s - loss: 0.0207 - val_loss: 0.1063
Epoch 18/20
 - 1s - loss: 0.0207 - val_loss: 0.1063
Epoch 19/20
 - 1s - loss: 0.0207 - val_loss: 0.1061
Epoch 20/20
 - 1s - loss: 0.0207 - val_loss: 0.1065
Mean Squared Error on the training data: 0.02502
Mean Squared Error on the test data:     0.10768
======================================================================================================
===================
Plot: 13 (out of 71)
===================
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
dense_14 (Dense)             (None, 7)                 56        
=================================================================
Total params: 56
Trainable params: 56
Non-trainable params: 0
_________________________________________________________________
Train on 988 samples, validate on 53 samples
Epoch 1/20
 - 2s - loss: 0.0183 - val_loss: 0.0448
Epoch 2/20
 - 2s - loss: 0.0089 - val_loss: 0.0443
Epoch 3/20
 - 2s - loss: 0.0089 - val_loss: 0.0444
Epoch 4/20
 - 2s - loss: 0.0089 - val_loss: 0.0443
Epoch 5/20
 - 2s - loss: 0.0089 - val_loss: 0.0444
Epoch 6/20
 - 2s - loss: 0.0089 - val_loss: 0.0444
Epoch 7/20
 - 2s - loss: 0.0089 - val_loss: 0.0445
Epoch 8/20
 - 2s - loss: 0.0089 - val_loss: 0.0443
Epoch 9/20
 - 2s - loss: 0.0089 - val_loss: 0.0444
Epoch 10/20
 - 2s - loss: 0.0089 - val_loss: 0.0444
Epoch 11/20
 - 2s - loss: 0.0089 - val_loss: 0.0443
Epoch 12/20
 - 2s - loss: 0.0089 - val_loss: 0.0443
Epoch 13/20
 - 2s - loss: 0.0089 - val_loss: 0.0444
Epoch 14/20
 - 2s - loss: 0.0089 - val_loss: 0.0444
Epoch 15/20
 - 2s - loss: 0.0089 - val_loss: 0.0444
Epoch 16/20
 - 2s - loss: 0.0089 - val_loss: 0.0443
Epoch 17/20
 - 2s - loss: 0.0089 - val_loss: 0.0443
Epoch 18/20
 - 2s - loss: 0.0089 - val_loss: 0.0443
Epoch 19/20
 - 2s - loss: 0.0089 - val_loss: 0.0443
Epoch 20/20
 - 2s - loss: 0.0089 - val_loss: 0.0443
Mean Squared Error on the training data: 0.01068
Mean Squared Error on the test data:     0.03360
======================================================================================================
===================
Plot: 14 (out of 71)
===================
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
dense_15 (Dense)             (None, 7)                 56        
=================================================================
Total params: 56
Trainable params: 56
Non-trainable params: 0
_________________________________________________________________
Train on 988 samples, validate on 53 samples
Epoch 1/20
 - 2s - loss: 0.0727 - val_loss: 0.1283
Epoch 2/20
 - 2s - loss: 0.0377 - val_loss: 0.1271
Epoch 3/20
 - 2s - loss: 0.0375 - val_loss: 0.1271
Epoch 4/20
 - 2s - loss: 0.0374 - val_loss: 0.1269
Epoch 5/20
 - 2s - loss: 0.0374 - val_loss: 0.1274
Epoch 6/20
 - 2s - loss: 0.0374 - val_loss: 0.1270
Epoch 7/20
 - 2s - loss: 0.0374 - val_loss: 0.1270
Epoch 8/20
 - 2s - loss: 0.0374 - val_loss: 0.1278
Epoch 9/20
 - 1s - loss: 0.0374 - val_loss: 0.1270
Epoch 10/20
 - 2s - loss: 0.0374 - val_loss: 0.1271
Epoch 11/20
 - 2s - loss: 0.0374 - val_loss: 0.1273
Epoch 12/20
 - 2s - loss: 0.0374 - val_loss: 0.1272
Epoch 13/20
 - 2s - loss: 0.0374 - val_loss: 0.1276
Epoch 14/20
 - 2s - loss: 0.0374 - val_loss: 0.1273
Epoch 15/20
 - 2s - loss: 0.0374 - val_loss: 0.1270
Epoch 16/20
 - 2s - loss: 0.0374 - val_loss: 0.1270
Epoch 17/20
 - 2s - loss: 0.0374 - val_loss: 0.1272
Epoch 18/20
 - 2s - loss: 0.0374 - val_loss: 0.1270
Epoch 19/20
 - 2s - loss: 0.0374 - val_loss: 0.1270
Epoch 20/20
 - 2s - loss: 0.0374 - val_loss: 0.1277
Mean Squared Error on the training data: 0.04206
Mean Squared Error on the test data:     0.09335
======================================================================================================
===================
Plot: 15 (out of 71)
===================
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
dense_16 (Dense)             (None, 7)                 56        
=================================================================
Total params: 56
Trainable params: 56
Non-trainable params: 0
_________________________________________________________________
Train on 988 samples, validate on 53 samples
Epoch 1/20
 - 2s - loss: 0.0640 - val_loss: 0.0603
Epoch 2/20
 - 2s - loss: 0.0311 - val_loss: 0.0599
Epoch 3/20
 - 2s - loss: 0.0309 - val_loss: 0.0598
Epoch 4/20
 - 2s - loss: 0.0308 - val_loss: 0.0598
Epoch 5/20
 - 2s - loss: 0.0308 - val_loss: 0.0601
Epoch 6/20
 - 2s - loss: 0.0308 - val_loss: 0.0598
Epoch 7/20
 - 2s - loss: 0.0308 - val_loss: 0.0598
Epoch 8/20
 - 2s - loss: 0.0307 - val_loss: 0.0598
Epoch 9/20
 - 2s - loss: 0.0307 - val_loss: 0.0599
Epoch 10/20
 - 2s - loss: 0.0307 - val_loss: 0.0597
Epoch 11/20
 - 2s - loss: 0.0307 - val_loss: 0.0597
Epoch 12/20
 - 2s - loss: 0.0307 - val_loss: 0.0597
Epoch 13/20
 - 2s - loss: 0.0307 - val_loss: 0.0597
Epoch 14/20
 - 2s - loss: 0.0307 - val_loss: 0.0597
Epoch 15/20
 - 2s - loss: 0.0307 - val_loss: 0.0596
Epoch 16/20
 - 2s - loss: 0.0307 - val_loss: 0.0598
Epoch 17/20
 - 2s - loss: 0.0307 - val_loss: 0.0597
Epoch 18/20
 - 2s - loss: 0.0307 - val_loss: 0.0597
Epoch 19/20
 - 2s - loss: 0.0307 - val_loss: 0.0597
Epoch 20/20
 - 2s - loss: 0.0307 - val_loss: 0.0597
Mean Squared Error on the training data: 0.03215
Mean Squared Error on the test data:     0.09107
======================================================================================================
===================
Plot: 16 (out of 71)
===================
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
dense_17 (Dense)             (None, 7)                 56        
=================================================================
Total params: 56
Trainable params: 56
Non-trainable params: 0
_________________________________________________________________
Train on 988 samples, validate on 53 samples
Epoch 1/20
 - 2s - loss: 0.0495 - val_loss: 0.0044
Epoch 2/20
 - 2s - loss: 0.0283 - val_loss: 0.0021
Epoch 3/20
 - 2s - loss: 0.0280 - val_loss: 0.0020
Epoch 4/20
 - 2s - loss: 0.0279 - val_loss: 0.0021
Epoch 5/20
 - 2s - loss: 0.0279 - val_loss: 0.0021
Epoch 6/20
 - 2s - loss: 0.0278 - val_loss: 0.0021
Epoch 7/20
 - 2s - loss: 0.0278 - val_loss: 0.0021
Epoch 8/20
 - 2s - loss: 0.0278 - val_loss: 0.0020
Epoch 9/20
 - 2s - loss: 0.0278 - val_loss: 0.0020
Epoch 10/20
 - 2s - loss: 0.0278 - val_loss: 0.0020
Epoch 11/20
 - 2s - loss: 0.0277 - val_loss: 0.0020
Epoch 12/20
 - 2s - loss: 0.0277 - val_loss: 0.0021
Epoch 13/20
 - 2s - loss: 0.0277 - val_loss: 0.0020
Epoch 14/20
 - 2s - loss: 0.0277 - val_loss: 0.0020
Epoch 15/20
 - 2s - loss: 0.0277 - val_loss: 0.0020
Epoch 16/20
 - 2s - loss: 0.0277 - val_loss: 0.0020
Epoch 17/20
 - 2s - loss: 0.0277 - val_loss: 0.0020
Epoch 18/20
 - 2s - loss: 0.0277 - val_loss: 0.0021
Epoch 19/20
 - 2s - loss: 0.0277 - val_loss: 0.0021
Epoch 20/20
 - 2s - loss: 0.0277 - val_loss: 0.0021
Mean Squared Error on the training data: 0.02635
Mean Squared Error on the test data:     0.02486
======================================================================================================
===================
Plot: 17 (out of 71)
===================
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
dense_18 (Dense)             (None, 7)                 56        
=================================================================
Total params: 56
Trainable params: 56
Non-trainable params: 0
_________________________________________________________________
Train on 988 samples, validate on 53 samples
Epoch 1/20  - 2s - loss: 0.0147 - val_loss: 0.0711
Epoch 20/20 - 2s - loss: 0.0072 - val_loss: 0.0363
[epochs 2-19 omitted: loss settled at 0.0072 by epoch 4, val_loss flat at ~0.036]
Mean Squared Error on the training data: 0.00866
Mean Squared Error on the test data:     0.09597
======================================================================================================
===================
Plot: 18 (out of 71)
===================
dense_19 (Dense): output shape (None, 7), 56 params (all trainable)
Train on 988 samples, validate on 53 samples
Epoch 1/20  - 2s - loss: 0.0419 - val_loss: 0.0569
Epoch 20/20 - 2s - loss: 0.0260 - val_loss: 0.0562
[epochs 2-19 omitted: loss flat at ~0.026, val_loss flat at ~0.056]
Mean Squared Error on the training data: 0.02759
Mean Squared Error on the test data:     0.08174
======================================================================================================
===================
Plot: 19 (out of 71)
===================
dense_20 (Dense): output shape (None, 7), 56 params (all trainable)
Train on 988 samples, validate on 53 samples
Epoch 1/20  - 2s - loss: 0.0616 - val_loss: 0.0637
Epoch 20/20 - 2s - loss: 0.0402 - val_loss: 0.0614
[epochs 2-19 omitted: loss flat at ~0.040, val_loss drifting 0.064 -> 0.061]
Mean Squared Error on the training data: 0.04130
Mean Squared Error on the test data:     0.08361
======================================================================================================
===================
Plot: 20 (out of 71)
===================
dense_21 (Dense): output shape (None, 7), 56 params (all trainable)
Train on 988 samples, validate on 53 samples
Epoch 1/20  - 2s - loss: 0.0309 - val_loss: 0.0562
Epoch 20/20 - 2s - loss: 0.0226 - val_loss: 0.0552
[epochs 2-19 omitted: loss flat at ~0.023, val_loss flat at ~0.055]
Mean Squared Error on the training data: 0.02424
Mean Squared Error on the test data:     0.09734
======================================================================================================
===================
Plot: 21 (out of 71)
===================
dense_22 (Dense): output shape (None, 7), 56 params (all trainable)
Train on 988 samples, validate on 53 samples
Epoch 1/20  - 2s - loss: 0.0224 - val_loss: 0.0563
Epoch 20/20 - 2s - loss: 0.0158 - val_loss: 0.0543
[epochs 2-19 omitted: loss flat at ~0.016, val_loss flat at ~0.054]
Mean Squared Error on the training data: 0.01771
Mean Squared Error on the test data:     0.09357
======================================================================================================
===================
Plot: 22 (out of 71)
===================
dense_23 (Dense): output shape (None, 7), 56 params (all trainable)
Train on 988 samples, validate on 53 samples
Epoch 1/20  - 2s - loss: 0.0601 - val_loss: 0.0568
Epoch 20/20 - 2s - loss: 0.0332 - val_loss: 0.0527
[epochs 2-19 omitted: loss flat at ~0.033, val_loss drifting 0.055 -> 0.053]
Mean Squared Error on the training data: 0.03419
Mean Squared Error on the test data:     0.08589
======================================================================================================
===================
Plot: 23 (out of 71)
===================
dense_24 (Dense): output shape (None, 7), 56 params (all trainable)
Train on 988 samples, validate on 53 samples
Epoch 1/20  - 2s - loss: 0.0448 - val_loss: 0.0022
Epoch 20/20 - 2s - loss: 0.0021 - val_loss: 0.0012
[epochs 2-19 omitted: loss flat at ~0.0021, val_loss fluctuating 8.2131e-04 - 0.0018; best val_loss 8.2131e-04 at epoch 19]
Mean Squared Error on the training data: 0.00196
Mean Squared Error on the test data:     0.00196
======================================================================================================
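The "Mean Squared Error on the ..." lines reported after each run are the average squared difference between predicted and actual (scaled) values. A minimal NumPy sketch with made-up numbers; `y_true` and `y_pred` here are hypothetical, not the notebook's data:

```python
import numpy as np

# Hypothetical targets and predictions; the notebook's actual arrays
# come from the trained model and the scaled stock-price data.
y_true = np.array([0.10, 0.20, 0.30, 0.40])
y_pred = np.array([0.12, 0.18, 0.33, 0.37])

# Mean squared error: mean of the squared residuals.
mse = float(np.mean((y_true - y_pred) ** 2))
print(round(mse, 5))  # -> 0.00065
```
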
===================
Plot: 24 (out of 71)
===================
dense_25 (Dense): output shape (None, 7), 56 params (all trainable)
Train on 988 samples, validate on 53 samples
Epoch 1/20  - 2s - loss: 0.0412 - val_loss: 0.1165
Epoch 20/20 - 2s - loss: 0.0187 - val_loss: 0.1054
[epochs 2-19 omitted: loss flat at ~0.019, val_loss flat at ~0.106]
Mean Squared Error on the training data: 0.02310
Mean Squared Error on the test data:     0.09162
======================================================================================================
===================
Plot: 25 (out of 71)
===================
dense_26 (Dense): output shape (None, 7), 56 params (all trainable)
Train on 988 samples, validate on 53 samples
Epoch 1/20  - 2s - loss: 0.0716 - val_loss: 0.0199
Epoch 20/20 - 2s - loss: 0.0513 - val_loss: 0.0138
[epochs 2-19 omitted: loss flat at ~0.051, val_loss drifting 0.015 -> 0.014]
Mean Squared Error on the training data: 0.04934
Mean Squared Error on the test data:     0.00708
======================================================================================================
===================
Plot: 26 (out of 71)
===================
dense_27 (Dense): output shape (None, 7), 56 params (all trainable)
Train on 988 samples, validate on 53 samples
Epoch 1/20  - 2s - loss: 0.0211 - val_loss: 0.0596
Epoch 20/20 - 2s - loss: 0.0151 - val_loss: 0.0575
[epochs 2-19 omitted: loss flat at ~0.015, val_loss flat at ~0.058]
Mean Squared Error on the training data: 0.01730
Mean Squared Error on the test data:     0.09825
======================================================================================================
===================
Plot: 27 (out of 71)
===================
dense_28 (Dense): output shape (None, 7), 56 params (all trainable)
Train on 988 samples, validate on 53 samples
Epoch 1/20  - 2s - loss: 0.0484 - val_loss: 0.0679
Epoch 20/20 - 2s - loss: 0.0326 - val_loss: 0.0676
[epochs 2-19 omitted: loss flat at ~0.033, val_loss flat at ~0.068]
Mean Squared Error on the training data: 0.03440
Mean Squared Error on the test data:     0.09761
======================================================================================================
===================
Plot: 28 (out of 71)
===================
dense_29 (Dense): output shape (None, 7), 56 params (all trainable)
Train on 988 samples, validate on 53 samples
Epoch 1/20  - 2s - loss: 0.0034 - val_loss: 0.0335
Epoch 20/20 - 2s - loss: 0.0029 - val_loss: 0.0322
[epochs 2-19 omitted: loss flat at 0.0029, val_loss flat at ~0.032]
Mean Squared Error on the training data: 0.00436
Mean Squared Error on the test data:     0.13552
======================================================================================================
===================
Plot: 29 (out of 71)
===================
dense_30 (Dense): output shape (None, 7), 56 params (all trainable)
Train on 988 samples, validate on 53 samples
Epoch 1/20  - 2s - loss: 0.0949 - val_loss: 0.0657
Epoch 20/20 - 2s - loss: 0.0402 - val_loss: 0.0614
[epochs 2-19 omitted: loss flat at ~0.040, val_loss drifting 0.063 -> 0.061]
Mean Squared Error on the training data: 0.04129
Mean Squared Error on the test data:     0.06748
======================================================================================================
===================
Plot: 30 (out of 71)
===================
dense_31 (Dense): output shape (None, 7), 56 params (all trainable)
Train on 988 samples, validate on 52 samples
Epoch 1/20  - 2s - loss: 0.0347 - val_loss: 0.0029
Epoch 20/20 - 2s - loss: 0.0036 - val_loss: 0.0019
[epochs 2-19 omitted: loss flat at ~0.0036, val_loss fluctuating 0.0018-0.0029]
Mean Squared Error on the training data: 0.00348
Mean Squared Error on the test data:     0.00238
======================================================================================================
===================
Plot: 31 (out of 71)
===================
dense_32 (Dense): output shape (None, 7), 56 params (all trainable)
Train on 988 samples, validate on 53 samples
Epoch 1/20  - 2s - loss: 0.0196 - val_loss: 0.0057
Epoch 20/20 - 2s - loss: 0.0022 - val_loss: 0.0022
[epochs 2-19 omitted: loss flat at ~0.0022, val_loss fluctuating 0.0016-0.0021]
Mean Squared Error on the training data: 0.00217
Mean Squared Error on the test data:     0.00346
======================================================================================================
===================
Plot: 32 (out of 71)
===================
dense_33 (Dense): output shape (None, 7), 56 params (all trainable)
Train on 988 samples, validate on 53 samples
Epoch 1/20  - 2s - loss: 0.0700 - val_loss: 0.1076
Epoch 20/20 - 2s - loss: 0.0439 - val_loss: 0.1064
[epochs 2-19 omitted: loss flat at ~0.044, val_loss flat at ~0.106]
Mean Squared Error on the training data: 0.04699
Mean Squared Error on the test data:     0.07920
======================================================================================================
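A recurring pattern in these runs is a test MSE several times the training MSE (e.g. 0.00866 vs. 0.09597 for plot 17), suggesting the fitted model generalizes poorly on those series, while other runs generalize well (e.g. plot 23: 0.00196 vs. 0.00196). One way such cases could be flagged programmatically; `flag_overfit` is a hypothetical helper, not part of the notebook:

```python
def flag_overfit(train_mse, test_mse, ratio=3.0):
    """Flag a run whose test error exceeds its training error
    by more than `ratio` times (a rough overfitting heuristic)."""
    return test_mse > ratio * train_mse

# Values taken from the logs above:
print(flag_overfit(0.00866, 0.09597))  # plot 17 -> True
print(flag_overfit(0.00196, 0.00196))  # plot 23 -> False
```

The ratio threshold is arbitrary; on scaled data an absolute test-MSE cutoff could serve equally well.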
===================
Plot: 33 (out of 71)
===================
dense_34 (Dense): output shape (None, 7), 56 params (all trainable)
Train on 988 samples, validate on 53 samples
Epoch 1/20  - 2s - loss: 0.0052 - val_loss: 0.0258
Epoch 20/20 - 2s - loss: 0.0039 - val_loss: 0.0250
[epochs 2-19 omitted: loss flat at 0.0039, val_loss flat at ~0.025]
Mean Squared Error on the training data: 0.00497
Mean Squared Error on the test data:     0.07148
======================================================================================================
===================
Plot: 34 (out of 71)
===================
dense_35 (Dense): output shape (None, 7), 56 params (all trainable)
Train on 988 samples, validate on 53 samples
Epoch 1/20  - 2s - loss: 0.0050 - val_loss: 0.0024
Epoch 20/20 - 2s - loss: 0.0015 - val_loss: 0.0020
[epochs 2-19 omitted: loss flat at ~0.0015, val_loss fluctuating 0.0017-0.0021]
Mean Squared Error on the training data: 0.00156
Mean Squared Error on the test data:     0.00252
======================================================================================================
===================
Plot: 35 (out of 71)
===================
dense_36 (Dense): output shape (None, 7), 56 params (all trainable)
Train on 988 samples, validate on 53 samples
Epoch 1/20  - 2s - loss: 0.0467 - val_loss: 0.1259
Epoch 20/20 - 2s - loss: 0.0296 - val_loss: 0.1221
[epochs 2-19 omitted: loss flat at ~0.030, val_loss flat at ~0.122]
Mean Squared Error on the training data: 0.03429
Mean Squared Error on the test data:     0.11226
======================================================================================================
===================
Plot: 36 (out of 71)
===================
Model: a single Dense layer (dense_37), output shape (None, 7), 56 trainable params
Train on 988 samples, validate on 53 samples, 20 epochs (~2 s each):
loss fell from 0.0363 to 0.0223 and plateaued; val_loss flat at ~0.0717 (final 0.0717)
Mean Squared Error on the training data: 0.02482
Mean Squared Error on the test data:     0.09924
======================================================================================================
===================
Plot: 37 (out of 71)
===================
Model: a single Dense layer (dense_38), output shape (None, 7), 56 trainable params
Train on 988 samples, validate on 53 samples, 20 epochs (~2 s each):
loss fell from 0.0065 to 0.0015 and plateaued; val_loss fluctuated between 0.0037 and 0.0087 (final 0.0037)
Mean Squared Error on the training data: 0.00161
Mean Squared Error on the test data:     0.00147
======================================================================================================
===================
Plot: 38 (out of 71)
===================
Model: a single Dense layer (dense_39), output shape (None, 7), 56 trainable params
Train on 988 samples, validate on 53 samples, 20 epochs (~2 s each):
loss fell from 0.0282 to 0.0170 and plateaued; val_loss held near 0.066 (final 0.0656)
Mean Squared Error on the training data: 0.01949
Mean Squared Error on the test data:     0.05706
======================================================================================================
===================
Plot: 39 (out of 71)
===================
Model: a single Dense layer (dense_40), output shape (None, 7), 56 trainable params
Train on 988 samples, validate on 53 samples, 20 epochs (~2 s each):
loss fell from 0.0407 to 0.0209 and plateaued; val_loss held near 0.0265 (final 0.0266)
Mean Squared Error on the training data: 0.02122
Mean Squared Error on the test data:     0.06384
======================================================================================================
===================
Plot: 40 (out of 71)
===================
Model: a single Dense layer (dense_41), output shape (None, 7), 56 trainable params
Train on 988 samples, validate on 53 samples, 20 epochs (~2 s each):
loss fell from 0.0057 to 0.0044 and plateaued; val_loss held near 0.026 (final 0.0260)
Mean Squared Error on the training data: 0.00549
Mean Squared Error on the test data:     0.08273
======================================================================================================
===================
Plot: 41 (out of 71)
===================
Model: a single Dense layer (dense_42), output shape (None, 7), 56 trainable params
Train on 988 samples, validate on 53 samples, 20 epochs (~2 s each):
loss fell from 0.0484 to 0.0291 and plateaued; val_loss held near 0.0527 (final 0.0525)
Mean Squared Error on the training data: 0.03027
Mean Squared Error on the test data:     0.06956
======================================================================================================
===================
Plot: 42 (out of 71)
===================
Model: a single Dense layer (dense_43), output shape (None, 7), 56 trainable params
Train on 988 samples, validate on 53 samples, 20 epochs (~2 s each):
loss fell from 0.0535 to 0.0339 and plateaued; val_loss flat at ~0.0717 (final 0.0717)
Mean Squared Error on the training data: 0.03584
Mean Squared Error on the test data:     0.04874
======================================================================================================
===================
Plot: 43 (out of 71)
===================
Model: a single Dense layer (dense_44), output shape (None, 7), 56 trainable params
Train on 988 samples, validate on 53 samples, 20 epochs (2-3 s each):
loss fell from 0.0819 to 0.0512 and plateaued; val_loss fell from 0.0436 to ~0.0396 (final 0.0399)
Mean Squared Error on the training data: 0.05064
Mean Squared Error on the test data:     0.03145
======================================================================================================
===================
Plot: 44 (out of 71)
===================
Model: a single Dense layer (dense_45), output shape (None, 7), 56 trainable params
Train on 988 samples, validate on 53 samples, 20 epochs (2-3 s each):
loss fell from 0.0070 to 0.0064 and plateaued; val_loss held near 0.0371 (final 0.0370)
Mean Squared Error on the training data: 0.00792
Mean Squared Error on the test data:     0.03212
======================================================================================================
===================
Plot: 45 (out of 71)
===================
Model: a single Dense layer (dense_46), output shape (None, 7), 56 trainable params
Train on 988 samples, validate on 53 samples, 20 epochs (2-3 s each):
loss fell from 0.0328 to 0.0156 and plateaued; val_loss fell from 0.1282 to ~0.0633 (final 0.0632)
Mean Squared Error on the training data: 0.01804
Mean Squared Error on the test data:     0.09074
======================================================================================================
===================
Plot: 46 (out of 71)
===================
Model: a single Dense layer (dense_47), output shape (None, 7), 56 trainable params
Train on 988 samples, validate on 53 samples, 20 epochs (2-3 s each):
loss fell from 0.0711 to 0.0362 and plateaued; val_loss held near 0.0336 (final 0.0335)
Mean Squared Error on the training data: 0.03606
Mean Squared Error on the test data:     0.01216
======================================================================================================
===================
Plot: 47 (out of 71)
===================
Model: a single Dense layer (dense_48), output shape (None, 7), 56 trainable params
Train on 988 samples, validate on 53 samples, 20 epochs (2-3 s each):
loss fell from 0.0544 to 0.0238 and plateaued; val_loss fell from 0.2090 to ~0.1013 (final 0.1014)
Mean Squared Error on the training data: 0.02771
Mean Squared Error on the test data:     0.08319
======================================================================================================
===================
Plot: 48 (out of 71)
===================
Model: a single Dense layer (dense_49), output shape (None, 7), 56 trainable params
Train on 988 samples, validate on 53 samples, 20 epochs (~2 s each):
loss fell from 0.0346 to 0.0210 and plateaued; val_loss held near 0.1176 (final 0.1174)
Mean Squared Error on the training data: 0.02593
Mean Squared Error on the test data:     0.05310
======================================================================================================
===================
Plot: 49 (out of 71)
===================
Model: a single Dense layer (dense_50), output shape (None, 7), 56 trainable params
Train on 988 samples, validate on 53 samples, 20 epochs (2-3 s each):
loss fell from 0.0349 to 0.0188 and plateaued; val_loss fell from 0.0837 to ~0.0799 (final 0.0799)
Mean Squared Error on the training data: 0.02199
Mean Squared Error on the test data:     0.09151
======================================================================================================
===================
Plot: 50 (out of 71)
===================
Model: a single Dense layer (dense_51), output shape (None, 7), 56 trainable params
Train on 988 samples, validate on 53 samples, 20 epochs (2-5 s each):
loss fell from 0.0459 to 0.0263 and plateaued; val_loss fell from 0.0422 to ~0.0396 (final 0.0396)
Mean Squared Error on the training data: 0.02691
Mean Squared Error on the test data:     0.08747
======================================================================================================
===================
Plot: 51 (out of 71)
===================
Model: a single Dense layer (dense_52), output shape (None, 7), 56 trainable params
Train on 988 samples, validate on 53 samples, 20 epochs (2-3 s each):
loss fell from 0.0111 to 0.0081 and plateaued; val_loss fell from 0.0234 to ~0.0215 (final 0.0215)
Mean Squared Error on the training data: 0.00880
Mean Squared Error on the test data:     0.06315
======================================================================================================
===================
Plot: 52 (out of 71)
===================
Model: a single Dense layer (dense_53), output shape (None, 7), 56 trainable params
Train on 988 samples, validate on 53 samples, 20 epochs (~2 s each):
loss fell from 0.0134 to 0.0029 and plateaued; val_loss fluctuated between 0.0014 and 0.0025 (final 0.0016)
Mean Squared Error on the training data: 0.00282
Mean Squared Error on the test data:     0.00273
======================================================================================================
===================
Plot: 53 (out of 71)
===================
Model: a single Dense layer (dense_54), output shape (None, 7), 56 trainable params
Train on 988 samples, validate on 53 samples, 20 epochs (~2 s each):
loss fell from 0.0904 to 0.0818 and plateaued; val_loss held near 0.0508 (final 0.0506)
Mean Squared Error on the training data: 0.08023
Mean Squared Error on the test data:     0.04321
======================================================================================================
===================
Plot: 54 (out of 71)
===================
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
dense_55 (Dense)             (None, 7)                 56        
=================================================================
Total params: 56
Trainable params: 56
Non-trainable params: 0
_________________________________________________________________
Train on 988 samples, validate on 53 samples
Epoch 1/20
 - 3s - loss: 0.0377 - val_loss: 0.1190
Epoch 2/20
 - 2s - loss: 0.0274 - val_loss: 0.1173
Epoch 3/20
 - 2s - loss: 0.0270 - val_loss: 0.1170
Epoch 4/20
 - 2s - loss: 0.0269 - val_loss: 0.1170
Epoch 5/20
 - 2s - loss: 0.0268 - val_loss: 0.1181
Epoch 6/20
 - 2s - loss: 0.0268 - val_loss: 0.1172
Epoch 7/20
 - 2s - loss: 0.0268 - val_loss: 0.1172
Epoch 8/20
 - 2s - loss: 0.0268 - val_loss: 0.1175
Epoch 9/20
 - 2s - loss: 0.0268 - val_loss: 0.1172
Epoch 10/20
 - 2s - loss: 0.0267 - val_loss: 0.1177
Epoch 11/20
 - 2s - loss: 0.0268 - val_loss: 0.1175
Epoch 12/20
 - 2s - loss: 0.0267 - val_loss: 0.1171
Epoch 13/20
 - 2s - loss: 0.0267 - val_loss: 0.1181
Epoch 14/20
 - 2s - loss: 0.0268 - val_loss: 0.1178
Epoch 15/20
 - 2s - loss: 0.0267 - val_loss: 0.1172
Epoch 16/20
 - 2s - loss: 0.0267 - val_loss: 0.1170
Epoch 17/20
 - 2s - loss: 0.0267 - val_loss: 0.1170
Epoch 18/20
 - 2s - loss: 0.0267 - val_loss: 0.1168
Epoch 19/20
 - 2s - loss: 0.0267 - val_loss: 0.1169
Epoch 20/20
 - 2s - loss: 0.0267 - val_loss: 0.1173
Mean Squared Error on the training data: 0.03130
Mean Squared Error on the test data:     0.10839
======================================================================================================
===================
Plot: 55 (out of 71)
===================
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
dense_56 (Dense)             (None, 7)                 56        
=================================================================
Total params: 56
Trainable params: 56
Non-trainable params: 0
_________________________________________________________________
Train on 988 samples, validate on 53 samples
Epoch 1/20
 - 3s - loss: 0.0503 - val_loss: 0.0031
Epoch 2/20
 - 2s - loss: 0.0358 - val_loss: 0.0022
Epoch 3/20
 - 2s - loss: 0.0356 - val_loss: 0.0021
Epoch 4/20
 - 2s - loss: 0.0355 - val_loss: 0.0021
Epoch 5/20
 - 2s - loss: 0.0355 - val_loss: 0.0020
Epoch 6/20
 - 2s - loss: 0.0354 - val_loss: 0.0020
Epoch 7/20
 - 2s - loss: 0.0354 - val_loss: 0.0020
Epoch 8/20
 - 2s - loss: 0.0354 - val_loss: 0.0021
Epoch 9/20
 - 2s - loss: 0.0354 - val_loss: 0.0021
Epoch 10/20
 - 2s - loss: 0.0354 - val_loss: 0.0020
Epoch 11/20
 - 2s - loss: 0.0354 - val_loss: 0.0020
Epoch 12/20
 - 3s - loss: 0.0354 - val_loss: 0.0021
Epoch 13/20
 - 3s - loss: 0.0354 - val_loss: 0.0020
Epoch 14/20
 - 3s - loss: 0.0354 - val_loss: 0.0020
Epoch 15/20
 - 3s - loss: 0.0354 - val_loss: 0.0019
Epoch 16/20
 - 2s - loss: 0.0354 - val_loss: 0.0020
Epoch 17/20
 - 3s - loss: 0.0354 - val_loss: 0.0020
Epoch 18/20
 - 3s - loss: 0.0354 - val_loss: 0.0019
Epoch 19/20
 - 3s - loss: 0.0354 - val_loss: 0.0019
Epoch 20/20
 - 3s - loss: 0.0354 - val_loss: 0.0020
Mean Squared Error on the training data: 0.03366
Mean Squared Error on the test data:     0.00243
======================================================================================================
===================
Plot: 56 (out of 71)
===================
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
dense_57 (Dense)             (None, 7)                 56        
=================================================================
Total params: 56
Trainable params: 56
Non-trainable params: 0
_________________________________________________________________
Train on 988 samples, validate on 53 samples
Epoch 1/20
 - 4s - loss: 0.0073 - val_loss: 0.0030
Epoch 2/20
 - 3s - loss: 0.0014 - val_loss: 0.0016
Epoch 3/20
 - 3s - loss: 0.0013 - val_loss: 0.0017
Epoch 4/20
 - 2s - loss: 0.0013 - val_loss: 0.0017
Epoch 5/20
 - 2s - loss: 0.0013 - val_loss: 0.0024
Epoch 6/20
 - 2s - loss: 0.0013 - val_loss: 0.0018
Epoch 7/20
 - 2s - loss: 0.0013 - val_loss: 0.0027
Epoch 8/20
 - 2s - loss: 0.0013 - val_loss: 0.0024
Epoch 9/20
 - 2s - loss: 0.0013 - val_loss: 0.0019
Epoch 10/20
 - 2s - loss: 0.0013 - val_loss: 0.0026
Epoch 11/20
 - 2s - loss: 0.0013 - val_loss: 0.0043
Epoch 12/20
 - 2s - loss: 0.0013 - val_loss: 0.0024
Epoch 13/20
 - 2s - loss: 0.0013 - val_loss: 0.0020
Epoch 14/20
 - 2s - loss: 0.0013 - val_loss: 0.0022
Epoch 15/20
 - 2s - loss: 0.0013 - val_loss: 0.0026
Epoch 16/20
 - 2s - loss: 0.0013 - val_loss: 0.0018
Epoch 17/20
 - 2s - loss: 0.0013 - val_loss: 0.0018
Epoch 18/20
 - 2s - loss: 0.0013 - val_loss: 0.0019
Epoch 19/20
 - 2s - loss: 0.0013 - val_loss: 0.0019
Epoch 20/20
 - 2s - loss: 0.0013 - val_loss: 0.0030
Mean Squared Error on the training data: 0.00138
Mean Squared Error on the test data:     0.00235
======================================================================================================
===================
Plot: 57 (out of 71)
===================
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
dense_58 (Dense)             (None, 7)                 56        
=================================================================
Total params: 56
Trainable params: 56
Non-trainable params: 0
_________________________________________________________________
Train on 988 samples, validate on 53 samples
Epoch 1/20
 - 3s - loss: 0.0574 - val_loss: 0.0358
Epoch 2/20
 - 2s - loss: 0.0354 - val_loss: 0.0355
Epoch 3/20
 - 2s - loss: 0.0350 - val_loss: 0.0352
Epoch 4/20
 - 2s - loss: 0.0349 - val_loss: 0.0351
Epoch 5/20
 - 2s - loss: 0.0348 - val_loss: 0.0350
Epoch 6/20
 - 2s - loss: 0.0348 - val_loss: 0.0349
Epoch 7/20
 - 2s - loss: 0.0348 - val_loss: 0.0349
Epoch 8/20
 - 2s - loss: 0.0348 - val_loss: 0.0350
Epoch 9/20
 - 2s - loss: 0.0348 - val_loss: 0.0350
Epoch 10/20
 - 2s - loss: 0.0348 - val_loss: 0.0349
Epoch 11/20
 - 2s - loss: 0.0348 - val_loss: 0.0349
Epoch 12/20
 - 2s - loss: 0.0348 - val_loss: 0.0349
Epoch 13/20
 - 2s - loss: 0.0348 - val_loss: 0.0349
Epoch 14/20
 - 2s - loss: 0.0348 - val_loss: 0.0349
Epoch 15/20
 - 2s - loss: 0.0348 - val_loss: 0.0349
Epoch 16/20
 - 2s - loss: 0.0348 - val_loss: 0.0349
Epoch 17/20
 - 2s - loss: 0.0348 - val_loss: 0.0349
Epoch 18/20
 - 2s - loss: 0.0348 - val_loss: 0.0349
Epoch 19/20
 - 2s - loss: 0.0348 - val_loss: 0.0351
Epoch 20/20
 - 2s - loss: 0.0348 - val_loss: 0.0349
Mean Squared Error on the training data: 0.03475
Mean Squared Error on the test data:     0.01200
======================================================================================================
===================
Plot: 58 (out of 71)
===================
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
dense_59 (Dense)             (None, 7)                 56        
=================================================================
Total params: 56
Trainable params: 56
Non-trainable params: 0
_________________________________________________________________
Train on 988 samples, validate on 53 samples
Epoch 1/20
 - 3s - loss: 0.0586 - val_loss: 0.0086
Epoch 2/20
 - 2s - loss: 0.0439 - val_loss: 0.0075
Epoch 3/20
 - 2s - loss: 0.0436 - val_loss: 0.0075
Epoch 4/20
 - 2s - loss: 0.0435 - val_loss: 0.0072
Epoch 5/20
 - 2s - loss: 0.0434 - val_loss: 0.0071
Epoch 6/20
 - 2s - loss: 0.0434 - val_loss: 0.0071
Epoch 7/20
 - 2s - loss: 0.0433 - val_loss: 0.0072
Epoch 8/20
 - 2s - loss: 0.0433 - val_loss: 0.0074
Epoch 9/20
 - 2s - loss: 0.0433 - val_loss: 0.0071
Epoch 10/20
 - 2s - loss: 0.0433 - val_loss: 0.0071
Epoch 11/20
 - 2s - loss: 0.0433 - val_loss: 0.0071
Epoch 12/20
 - 2s - loss: 0.0433 - val_loss: 0.0071
Epoch 13/20
 - 2s - loss: 0.0433 - val_loss: 0.0073
Epoch 14/20
 - 2s - loss: 0.0433 - val_loss: 0.0071
Epoch 15/20
 - 2s - loss: 0.0433 - val_loss: 0.0071
Epoch 16/20
 - 2s - loss: 0.0433 - val_loss: 0.0072
Epoch 17/20
 - 2s - loss: 0.0433 - val_loss: 0.0071
Epoch 18/20
 - 2s - loss: 0.0433 - val_loss: 0.0071
Epoch 19/20
 - 2s - loss: 0.0433 - val_loss: 0.0071
Epoch 20/20
 - 2s - loss: 0.0433 - val_loss: 0.0071
Mean Squared Error on the training data: 0.04148
Mean Squared Error on the test data:     0.02907
======================================================================================================
===================
Plot: 59 (out of 71)
===================
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
dense_60 (Dense)             (None, 7)                 56        
=================================================================
Total params: 56
Trainable params: 56
Non-trainable params: 0
_________________________________________________________________
Train on 988 samples, validate on 53 samples
Epoch 1/20
 - 3s - loss: 0.0210 - val_loss: 0.0796
Epoch 2/20
 - 2s - loss: 0.0151 - val_loss: 0.0774
Epoch 3/20
 - 2s - loss: 0.0150 - val_loss: 0.0766
Epoch 4/20
 - 2s - loss: 0.0149 - val_loss: 0.0769
Epoch 5/20
 - 2s - loss: 0.0149 - val_loss: 0.0767
Epoch 6/20
 - 2s - loss: 0.0149 - val_loss: 0.0754
Epoch 7/20
 - 2s - loss: 0.0149 - val_loss: 0.0753
Epoch 8/20
 - 2s - loss: 0.0149 - val_loss: 0.0757
Epoch 9/20
 - 2s - loss: 0.0149 - val_loss: 0.0753
Epoch 10/20
 - 2s - loss: 0.0149 - val_loss: 0.0753
Epoch 11/20
 - 2s - loss: 0.0149 - val_loss: 0.0754
Epoch 12/20
 - 2s - loss: 0.0149 - val_loss: 0.0754
Epoch 13/20
 - 2s - loss: 0.0149 - val_loss: 0.0757
Epoch 14/20
 - 2s - loss: 0.0149 - val_loss: 0.0755
Epoch 15/20
 - 2s - loss: 0.0149 - val_loss: 0.0753
Epoch 16/20
 - 2s - loss: 0.0149 - val_loss: 0.0756
Epoch 17/20
 - 2s - loss: 0.0149 - val_loss: 0.0752
Epoch 18/20
 - 2s - loss: 0.0149 - val_loss: 0.0753
Epoch 19/20
 - 2s - loss: 0.0149 - val_loss: 0.0754
Epoch 20/20
 - 2s - loss: 0.0149 - val_loss: 0.0753
Mean Squared Error on the training data: 0.01793
Mean Squared Error on the test data:     0.07487
======================================================================================================
===================
Plot: 60 (out of 71)
===================
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
dense_61 (Dense)             (None, 7)                 56        
=================================================================
Total params: 56
Trainable params: 56
Non-trainable params: 0
_________________________________________________________________
Train on 988 samples, validate on 53 samples
Epoch 1/20
 - 3s - loss: 0.0713 - val_loss: 0.0485
Epoch 2/20
 - 2s - loss: 0.0453 - val_loss: 0.0485
Epoch 3/20
 - 2s - loss: 0.0449 - val_loss: 0.0479
Epoch 4/20
 - 2s - loss: 0.0448 - val_loss: 0.0483
Epoch 5/20
 - 2s - loss: 0.0447 - val_loss: 0.0483
Epoch 6/20
 - 2s - loss: 0.0447 - val_loss: 0.0498
Epoch 7/20
 - 2s - loss: 0.0446 - val_loss: 0.0481
Epoch 8/20
 - 2s - loss: 0.0446 - val_loss: 0.0487
Epoch 9/20
 - 2s - loss: 0.0446 - val_loss: 0.0487
Epoch 10/20
 - 2s - loss: 0.0446 - val_loss: 0.0486
Epoch 11/20
 - 2s - loss: 0.0446 - val_loss: 0.0487
Epoch 12/20
 - 2s - loss: 0.0446 - val_loss: 0.0493
Epoch 13/20
 - 2s - loss: 0.0446 - val_loss: 0.0485
Epoch 14/20
 - 2s - loss: 0.0446 - val_loss: 0.0483
Epoch 15/20
 - 2s - loss: 0.0446 - val_loss: 0.0489
Epoch 16/20
 - 2s - loss: 0.0446 - val_loss: 0.0488
Epoch 17/20
 - 2s - loss: 0.0446 - val_loss: 0.0482
Epoch 18/20
 - 2s - loss: 0.0446 - val_loss: 0.0490
Epoch 19/20
 - 2s - loss: 0.0446 - val_loss: 0.0482
Epoch 20/20
 - 2s - loss: 0.0446 - val_loss: 0.0486
Mean Squared Error on the training data: 0.04472
Mean Squared Error on the test data:     0.02097
======================================================================================================
===================
Plot: 61 (out of 71)
===================
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
dense_62 (Dense)             (None, 7)                 56        
=================================================================
Total params: 56
Trainable params: 56
Non-trainable params: 0
_________________________________________________________________
Train on 988 samples, validate on 53 samples
Epoch 1/20
 - 3s - loss: 0.0524 - val_loss: 0.0171
Epoch 2/20
 - 2s - loss: 0.0320 - val_loss: 0.0169
Epoch 3/20
 - 2s - loss: 0.0317 - val_loss: 0.0170
Epoch 4/20
 - 2s - loss: 0.0317 - val_loss: 0.0168
Epoch 5/20
 - 2s - loss: 0.0317 - val_loss: 0.0169
Epoch 6/20
 - 2s - loss: 0.0317 - val_loss: 0.0169
Epoch 7/20
 - 2s - loss: 0.0317 - val_loss: 0.0168
Epoch 8/20
 - 2s - loss: 0.0317 - val_loss: 0.0168
Epoch 9/20
 - 2s - loss: 0.0317 - val_loss: 0.0168
Epoch 10/20
 - 2s - loss: 0.0316 - val_loss: 0.0168
Epoch 11/20
 - 2s - loss: 0.0316 - val_loss: 0.0168
Epoch 12/20
 - 2s - loss: 0.0316 - val_loss: 0.0168
Epoch 13/20
 - 2s - loss: 0.0316 - val_loss: 0.0169
Epoch 14/20
 - 2s - loss: 0.0316 - val_loss: 0.0168
Epoch 15/20
 - 2s - loss: 0.0316 - val_loss: 0.0170
Epoch 16/20
 - 2s - loss: 0.0316 - val_loss: 0.0168
Epoch 17/20
 - 2s - loss: 0.0316 - val_loss: 0.0168
Epoch 18/20
 - 2s - loss: 0.0316 - val_loss: 0.0168
Epoch 19/20
 - 2s - loss: 0.0316 - val_loss: 0.0169
Epoch 20/20
 - 2s - loss: 0.0316 - val_loss: 0.0169
Mean Squared Error on the training data: 0.03080
Mean Squared Error on the test data:     0.05830
======================================================================================================
===================
Plot: 62 (out of 71)
===================
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
dense_63 (Dense)             (None, 7)                 56        
=================================================================
Total params: 56
Trainable params: 56
Non-trainable params: 0
_________________________________________________________________
Train on 988 samples, validate on 53 samples
Epoch 1/20
 - 3s - loss: 0.0149 - val_loss: 0.0524
Epoch 2/20
 - 2s - loss: 0.0104 - val_loss: 0.0522
Epoch 3/20
 - 2s - loss: 0.0103 - val_loss: 0.0522
Epoch 4/20
 - 2s - loss: 0.0103 - val_loss: 0.0523
Epoch 5/20
 - 2s - loss: 0.0102 - val_loss: 0.0522
Epoch 6/20
 - 2s - loss: 0.0102 - val_loss: 0.0521
Epoch 7/20
 - 2s - loss: 0.0102 - val_loss: 0.0522
Epoch 8/20
 - 2s - loss: 0.0102 - val_loss: 0.0522
Epoch 9/20
 - 2s - loss: 0.0102 - val_loss: 0.0521
Epoch 10/20
 - 2s - loss: 0.0102 - val_loss: 0.0523
Epoch 11/20
 - 2s - loss: 0.0102 - val_loss: 0.0521
Epoch 12/20
 - 2s - loss: 0.0102 - val_loss: 0.0522
Epoch 13/20
 - 2s - loss: 0.0102 - val_loss: 0.0523
Epoch 14/20
 - 2s - loss: 0.0102 - val_loss: 0.0522
Epoch 15/20
 - 2s - loss: 0.0102 - val_loss: 0.0521
Epoch 16/20
 - 2s - loss: 0.0102 - val_loss: 0.0523
Epoch 17/20
 - 2s - loss: 0.0102 - val_loss: 0.0525
Epoch 18/20
 - 2s - loss: 0.0102 - val_loss: 0.0522
Epoch 19/20
 - 2s - loss: 0.0102 - val_loss: 0.0525
Epoch 20/20
 - 2s - loss: 0.0101 - val_loss: 0.0522
Mean Squared Error on the training data: 0.01229
Mean Squared Error on the test data:     0.08310
======================================================================================================
===================
Plot: 63 (out of 71)
===================
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
dense_64 (Dense)             (None, 7)                 56        
=================================================================
Total params: 56
Trainable params: 56
Non-trainable params: 0
_________________________________________________________________
Train on 988 samples, validate on 53 samples
Epoch 1/20
 - 3s - loss: 0.0457 - val_loss: 0.0493
Epoch 2/20
 - 2s - loss: 0.0325 - val_loss: 0.0494
Epoch 3/20
 - 2s - loss: 0.0322 - val_loss: 0.0489
Epoch 4/20
 - 2s - loss: 0.0321 - val_loss: 0.0488
Epoch 5/20
 - 2s - loss: 0.0320 - val_loss: 0.0493
Epoch 6/20
 - 2s - loss: 0.0320 - val_loss: 0.0489
Epoch 7/20
 - 2s - loss: 0.0320 - val_loss: 0.0487
Epoch 8/20
 - 2s - loss: 0.0320 - val_loss: 0.0490
Epoch 9/20
 - 2s - loss: 0.0320 - val_loss: 0.0487
Epoch 10/20
 - 2s - loss: 0.0320 - val_loss: 0.0493
Epoch 11/20
 - 2s - loss: 0.0320 - val_loss: 0.0494
Epoch 12/20
 - 2s - loss: 0.0320 - val_loss: 0.0487
Epoch 13/20
 - 2s - loss: 0.0319 - val_loss: 0.0491
Epoch 14/20
 - 2s - loss: 0.0320 - val_loss: 0.0487
Epoch 15/20
 - 2s - loss: 0.0320 - val_loss: 0.0493
Epoch 16/20
 - 2s - loss: 0.0319 - val_loss: 0.0488
Epoch 17/20
 - 2s - loss: 0.0319 - val_loss: 0.0487
Epoch 18/20
 - 2s - loss: 0.0320 - val_loss: 0.0488
Epoch 19/20
 - 2s - loss: 0.0319 - val_loss: 0.0491
Epoch 20/20
 - 2s - loss: 0.0319 - val_loss: 0.0488
Mean Squared Error on the training data: 0.03277
Mean Squared Error on the test data:     0.02116
======================================================================================================
===================
Plot: 64 (out of 71)
===================
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
dense_65 (Dense)             (None, 7)                 56        
=================================================================
Total params: 56
Trainable params: 56
Non-trainable params: 0
_________________________________________________________________
Train on 988 samples, validate on 53 samples
Epoch 1/20
 - 3s - loss: 0.0724 - val_loss: 0.1349
Epoch 2/20
 - 2s - loss: 0.0649 - val_loss: 0.1349
Epoch 3/20
 - 2s - loss: 0.0648 - val_loss: 0.1348
Epoch 4/20
 - 2s - loss: 0.0647 - val_loss: 0.1347
Epoch 5/20
 - 2s - loss: 0.0647 - val_loss: 0.1347
Epoch 6/20
 - 2s - loss: 0.0646 - val_loss: 0.1346
Epoch 7/20
 - 2s - loss: 0.0646 - val_loss: 0.1347
Epoch 8/20
 - 2s - loss: 0.0646 - val_loss: 0.1347
Epoch 9/20
 - 2s - loss: 0.0646 - val_loss: 0.1347
Epoch 10/20
 - 2s - loss: 0.0646 - val_loss: 0.1346
Epoch 11/20
 - 2s - loss: 0.0646 - val_loss: 0.1348
Epoch 12/20
 - 2s - loss: 0.0646 - val_loss: 0.1346
Epoch 13/20
 - 2s - loss: 0.0646 - val_loss: 0.1346
Epoch 14/20
 - 2s - loss: 0.0646 - val_loss: 0.1347
Epoch 15/20
 - 2s - loss: 0.0646 - val_loss: 0.1347
Epoch 16/20
 - 2s - loss: 0.0646 - val_loss: 0.1347
Epoch 17/20
 - 2s - loss: 0.0646 - val_loss: 0.1347
Epoch 18/20
 - 2s - loss: 0.0646 - val_loss: 0.1347
Epoch 19/20
 - 2s - loss: 0.0646 - val_loss: 0.1346
Epoch 20/20
 - 2s - loss: 0.0646 - val_loss: 0.1346
Mean Squared Error on the training data: 0.06816
Mean Squared Error on the test data:     0.06801
======================================================================================================
===================
Plot: 65 (out of 71)
===================
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
dense_66 (Dense)             (None, 7)                 56        
=================================================================
Total params: 56
Trainable params: 56
Non-trainable params: 0
_________________________________________________________________
Train on 988 samples, validate on 53 samples
Epoch 1/20
 - 3s - loss: 0.0153 - val_loss: 0.0493
Epoch 2/20
 - 2s - loss: 0.0109 - val_loss: 0.0471
Epoch 3/20
 - 2s - loss: 0.0107 - val_loss: 0.0470
Epoch 4/20
 - 2s - loss: 0.0106 - val_loss: 0.0472
Epoch 5/20
 - 2s - loss: 0.0106 - val_loss: 0.0466
Epoch 6/20
 - 2s - loss: 0.0106 - val_loss: 0.0463
Epoch 7/20
 - 2s - loss: 0.0106 - val_loss: 0.0464
Epoch 8/20
 - 2s - loss: 0.0105 - val_loss: 0.0462
Epoch 9/20
 - 2s - loss: 0.0105 - val_loss: 0.0461
Epoch 10/20
 - 2s - loss: 0.0105 - val_loss: 0.0467
Epoch 11/20
 - 2s - loss: 0.0105 - val_loss: 0.0461
Epoch 12/20
 - 2s - loss: 0.0105 - val_loss: 0.0462
Epoch 13/20
 - 2s - loss: 0.0105 - val_loss: 0.0466
Epoch 14/20
 - 2s - loss: 0.0105 - val_loss: 0.0463
Epoch 15/20
 - 2s - loss: 0.0105 - val_loss: 0.0462
Epoch 16/20
 - 2s - loss: 0.0105 - val_loss: 0.0462
Epoch 17/20
 - 2s - loss: 0.0105 - val_loss: 0.0462
Epoch 18/20
 - 2s - loss: 0.0105 - val_loss: 0.0463
Epoch 19/20
 - 2s - loss: 0.0105 - val_loss: 0.0462
Epoch 20/20
 - 2s - loss: 0.0105 - val_loss: 0.0463
Mean Squared Error on the training data: 0.01229
Mean Squared Error on the test data:     0.09103
======================================================================================================
===================
Plot: 66 (out of 71)
===================
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
dense_67 (Dense)             (None, 7)                 56        
=================================================================
Total params: 56
Trainable params: 56
Non-trainable params: 0
_________________________________________________________________
Train on 988 samples, validate on 53 samples
Epoch 1/20
 - 3s - loss: 0.0301 - val_loss: 0.0486
Epoch 2/20
 - 2s - loss: 0.0215 - val_loss: 0.0477
Epoch 3/20
 - 2s - loss: 0.0211 - val_loss: 0.0479
Epoch 4/20
 - 2s - loss: 0.0210 - val_loss: 0.0475
Epoch 5/20
 - 2s - loss: 0.0210 - val_loss: 0.0487
Epoch 6/20
 - 2s - loss: 0.0210 - val_loss: 0.0474
Epoch 7/20
 - 2s - loss: 0.0209 - val_loss: 0.0475
Epoch 8/20
 - 2s - loss: 0.0209 - val_loss: 0.0473
Epoch 9/20
 - 2s - loss: 0.0209 - val_loss: 0.0472
Epoch 10/20
 - 2s - loss: 0.0209 - val_loss: 0.0471
Epoch 11/20
 - 2s - loss: 0.0209 - val_loss: 0.0473
Epoch 12/20
 - 2s - loss: 0.0209 - val_loss: 0.0472
Epoch 13/20
 - 2s - loss: 0.0209 - val_loss: 0.0471
Epoch 14/20
 - 2s - loss: 0.0209 - val_loss: 0.0472
Epoch 15/20
 - 2s - loss: 0.0209 - val_loss: 0.0471
Epoch 16/20
 - 2s - loss: 0.0208 - val_loss: 0.0475
Epoch 17/20
 - 2s - loss: 0.0208 - val_loss: 0.0470
Epoch 18/20
 - 2s - loss: 0.0209 - val_loss: 0.0470
Epoch 19/20
 - 2s - loss: 0.0208 - val_loss: 0.0470
Epoch 20/20
 - 2s - loss: 0.0208 - val_loss: 0.0475
Mean Squared Error on the training data: 0.02217
Mean Squared Error on the test data:     0.07852
======================================================================================================
===================
Plot: 67 (out of 71)
===================
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
dense_68 (Dense)             (None, 7)                 56        
=================================================================
Total params: 56
Trainable params: 56
Non-trainable params: 0
_________________________________________________________________
Train on 988 samples, validate on 53 samples
Epoch 1/20
 - 3s - loss: 0.0566 - val_loss: 0.0839
Epoch 2/20
 - 2s - loss: 0.0382 - val_loss: 0.0835
Epoch 3/20
 - 2s - loss: 0.0379 - val_loss: 0.0834
Epoch 4/20
 - 2s - loss: 0.0378 - val_loss: 0.0841
Epoch 5/20
 - 2s - loss: 0.0377 - val_loss: 0.0837
Epoch 6/20
 - 2s - loss: 0.0377 - val_loss: 0.0837
Epoch 7/20
 - 2s - loss: 0.0377 - val_loss: 0.0837
Epoch 8/20
 - 2s - loss: 0.0377 - val_loss: 0.0834
Epoch 9/20
 - 2s - loss: 0.0377 - val_loss: 0.0838
Epoch 10/20
 - 2s - loss: 0.0377 - val_loss: 0.0837
Epoch 11/20
 - 2s - loss: 0.0376 - val_loss: 0.0832
Epoch 12/20
 - 2s - loss: 0.0376 - val_loss: 0.0840
Epoch 13/20
 - 2s - loss: 0.0376 - val_loss: 0.0831
Epoch 14/20
 - 2s - loss: 0.0376 - val_loss: 0.0832
Epoch 15/20
 - 2s - loss: 0.0376 - val_loss: 0.0834
Epoch 16/20
 - 2s - loss: 0.0376 - val_loss: 0.0837
Epoch 17/20
 - 2s - loss: 0.0376 - val_loss: 0.0834
Epoch 18/20
 - 2s - loss: 0.0376 - val_loss: 0.0831
Epoch 19/20
 - 2s - loss: 0.0376 - val_loss: 0.0832
Epoch 20/20
 - 2s - loss: 0.0376 - val_loss: 0.0834
Mean Squared Error on the training data: 0.03988
Mean Squared Error on the test data:     0.03817
======================================================================================================
===================
Plot: 68 (out of 71)
===================
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
dense_69 (Dense)             (None, 7)                 56        
=================================================================
Total params: 56
Trainable params: 56
Non-trainable params: 0
_________________________________________________________________
Train on 988 samples, validate on 53 samples
Epoch 1/20
 - 3s - loss: 0.0081 - val_loss: 0.0095
Epoch 2/20
 - 2s - loss: 4.6727e-04 - val_loss: 4.8987e-04
Epoch 3/20
 - 2s - loss: 3.2141e-04 - val_loss: 5.0690e-04
Epoch 4/20
 - 2s - loss: 3.1444e-04 - val_loss: 5.9234e-04
Epoch 5/20
 - 2s - loss: 3.0939e-04 - val_loss: 5.3222e-04
Epoch 6/20
 - 2s - loss: 3.0584e-04 - val_loss: 5.2147e-04
Epoch 7/20
 - 2s - loss: 3.0067e-04 - val_loss: 4.9447e-04
Epoch 8/20
 - 2s - loss: 2.9740e-04 - val_loss: 5.1113e-04
Epoch 9/20
 - 2s - loss: 2.9756e-04 - val_loss: 4.2983e-04
Epoch 10/20
 - 2s - loss: 2.9384e-04 - val_loss: 6.7533e-04
Epoch 11/20
 - 2s - loss: 2.9323e-04 - val_loss: 4.9208e-04
Epoch 12/20
 - 2s - loss: 2.8892e-04 - val_loss: 4.4772e-04
Epoch 13/20
 - 2s - loss: 2.8742e-04 - val_loss: 4.6178e-04
Epoch 14/20
 - 2s - loss: 2.8448e-04 - val_loss: 4.4102e-04
Epoch 15/20
 - 2s - loss: 2.8468e-04 - val_loss: 4.4621e-04
Epoch 16/20
 - 2s - loss: 2.8533e-04 - val_loss: 4.2728e-04
Epoch 17/20
 - 2s - loss: 2.8743e-04 - val_loss: 3.8750e-04
Epoch 18/20
 - 2s - loss: 2.8366e-04 - val_loss: 4.1807e-04
Epoch 19/20
 - 2s - loss: 2.8240e-04 - val_loss: 5.2730e-04
Epoch 20/20
 - 2s - loss: 2.8150e-04 - val_loss: 4.1986e-04
Mean Squared Error on the training data: 0.00029
Mean Squared Error on the test data:     0.00050
======================================================================================================
===================
Plot: 69 (out of 71)
===================
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
dense_70 (Dense)             (None, 7)                 56        
=================================================================
Total params: 56
Trainable params: 56
Non-trainable params: 0
_________________________________________________________________
Train on 988 samples, validate on 53 samples
Epoch 1/20
 - 3s - loss: 0.0151 - val_loss: 0.0054
Epoch 2/20
 - 2s - loss: 0.0022 - val_loss: 0.0033
Epoch 3/20
 - 2s - loss: 0.0019 - val_loss: 0.0030
Epoch 4/20
 - 3s - loss: 0.0019 - val_loss: 0.0028
Epoch 5/20
 - 3s - loss: 0.0018 - val_loss: 0.0027
Epoch 6/20
 - 2s - loss: 0.0018 - val_loss: 0.0026
Epoch 7/20
 - 3s - loss: 0.0018 - val_loss: 0.0030
Epoch 8/20
 - 3s - loss: 0.0018 - val_loss: 0.0029
Epoch 9/20
 - 2s - loss: 0.0017 - val_loss: 0.0024
Epoch 10/20
 - 2s - loss: 0.0017 - val_loss: 0.0023
Epoch 11/20
 - 2s - loss: 0.0017 - val_loss: 0.0024
Epoch 12/20
 - 3s - loss: 0.0017 - val_loss: 0.0024
Epoch 13/20
 - 3s - loss: 0.0017 - val_loss: 0.0022
Epoch 14/20
 - 3s - loss: 0.0017 - val_loss: 0.0023
Epoch 15/20
 - 2s - loss: 0.0017 - val_loss: 0.0022
Epoch 16/20
 - 3s - loss: 0.0016 - val_loss: 0.0021
Epoch 17/20
 - 3s - loss: 0.0016 - val_loss: 0.0021
Epoch 18/20
 - 2s - loss: 0.0016 - val_loss: 0.0022
Epoch 19/20
 - 2s - loss: 0.0016 - val_loss: 0.0022
Epoch 20/20
 - 2s - loss: 0.0016 - val_loss: 0.0021
Mean Squared Error on the training data: 0.00163
Mean Squared Error on the test data:     0.00163
======================================================================================================
===================
Plot: 70 (out of 71)
===================
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
dense_71 (Dense)             (None, 7)                 56        
=================================================================
Total params: 56
Trainable params: 56
Non-trainable params: 0
_________________________________________________________________
Train on 988 samples, validate on 53 samples
Epoch 1/20
 - 3s - loss: 0.0233 - val_loss: 0.0462
Epoch 2/20
 - 2s - loss: 0.0157 - val_loss: 0.0454
Epoch 3/20
 - 2s - loss: 0.0155 - val_loss: 0.0453
Epoch 4/20
 - 3s - loss: 0.0155 - val_loss: 0.0455
Epoch 5/20
 - 2s - loss: 0.0154 - val_loss: 0.0453
Epoch 6/20
 - 2s - loss: 0.0154 - val_loss: 0.0453
Epoch 7/20
 - 2s - loss: 0.0154 - val_loss: 0.0453
Epoch 8/20
 - 2s - loss: 0.0154 - val_loss: 0.0452
Epoch 9/20
 - 2s - loss: 0.0154 - val_loss: 0.0453
Epoch 10/20
 - 2s - loss: 0.0154 - val_loss: 0.0452
Epoch 11/20
 - 3s - loss: 0.0154 - val_loss: 0.0452
Epoch 12/20
 - 2s - loss: 0.0154 - val_loss: 0.0453
Epoch 13/20
 - 2s - loss: 0.0154 - val_loss: 0.0452
Epoch 14/20
 - 2s - loss: 0.0154 - val_loss: 0.0453
Epoch 15/20
 - 2s - loss: 0.0154 - val_loss: 0.0453
Epoch 16/20
 - 2s - loss: 0.0154 - val_loss: 0.0452
Epoch 17/20
 - 2s - loss: 0.0154 - val_loss: 0.0453
Epoch 18/20
 - 2s - loss: 0.0154 - val_loss: 0.0452
Epoch 19/20
 - 2s - loss: 0.0154 - val_loss: 0.0452
Epoch 20/20
 - 2s - loss: 0.0154 - val_loss: 0.0452
Mean Squared Error on the training data: 0.01688
Mean Squared Error on the test data:     0.10303
======================================================================================================
===================
Plot: 71 (out of 71)
===================
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
dense_72 (Dense)             (None, 7)                 56        
=================================================================
Total params: 56
Trainable params: 56
Non-trainable params: 0
_________________________________________________________________
Train on 988 samples, validate on 53 samples
Epoch 1/20
 - 3s - loss: 0.0380 - val_loss: 0.1918
Epoch 2/20
 - 2s - loss: 0.0341 - val_loss: 0.1899
[epochs 3-19 omitted: loss holds at 0.0339-0.0340, val_loss at 0.1899-0.1912]
Epoch 20/20
 - 2s - loss: 0.0339 - val_loss: 0.1900
Mean Squared Error on the training data: 0.04180
Mean Squared Error on the test data:     0.20436
======================================================================================================
======================================================================================================
Total run time in seconds: 2593

Long Short-Term Memory (LSTM)

In [34]:
# Define the LSTM model
def LSTM_model(inputs, output_size, neurons, activ_func="linear",
                dropout=0.5, loss="mean_squared_error", optimizer="adam"):
    model = Sequential()
    
    model.add(LSTM(neurons, input_shape=(inputs.shape[1], inputs.shape[2])))
    #model.add(Activation('tanh'))
    model.add(Dropout(dropout))
    
    model.add(Dense(units=output_size))
    #model.add(LeakyReLU())
    model.add(Activation(activ_func))
    
    model.compile(loss=loss, optimizer=optimizer)
    
    model.summary()
    return model


# Worse with a sigmoid activation function.
# tanh resulted in overfitting (better results on the training data, worse on the test data).
# Roughly the same training time as before.

MinMaxScale the data

In [56]:
# Create a copy to keep scaled and normalized data apart. [Have to use copy.deepcopy()]

scaled_norm_stock_prices = copy.deepcopy(norm_stock_prices)
In [57]:
LSTM_train_list, LSTM_test_list = [], []    # Create lists to store the train and test dataframes
norm_train_list, norm_test_list = [], []

# Create train and test sets for the stocks
for stock_price in range(1, len(norm_stock_prices)):
    test_size = int(len(scaled_norm_stock_prices[stock_price]) * 0.20)
    
    # Normalized
    norm_train = norm_stock_prices[stock_price][test_size:]
    norm_test = norm_stock_prices[stock_price][0:test_size]
    norm_train_list.append(norm_train)
    norm_test_list.append(norm_test)  
    
    # Normalized and scaled data
    MMscale_data(scaled_norm_stock_prices[stock_price])
    LSTM_train = scaled_norm_stock_prices[stock_price][test_size:]
    LSTM_test = scaled_norm_stock_prices[stock_price][0:test_size]
    # save each one into a list
    LSTM_train_list.append(LSTM_train)
    LSTM_test_list.append(LSTM_test)  

print("Training samples: {0}".format(len(LSTM_train)))
print("Test samples:     {0}".format(len(LSTM_test)))
print(LSTM_train.shape)
Training samples: 1042
Test samples:     260
(1042, 7)
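The split above reserves the first 20% of the rows as the test set; since the frames are sorted newest-first (see the head() outputs further down), that top slice is the most recent fifth of the history. A minimal sketch of the same slicing on a toy frame (the dates and values here are made up):

```python
import pandas as pd

# Toy frame in the same newest-first order as the downloaded price data.
dates = pd.date_range('2018-01-01', periods=10)[::-1]
prices = pd.DataFrame({'Adj Close': range(10)}, index=dates)

test_size = int(len(prices) * 0.20)   # 20% of the rows
test = prices[0:test_size]            # the newest rows, at the top of the frame
train = prices[test_size:]            # the older 80%
print(len(train), len(test))          # 8 2
```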

Predicting one day with the LSTM

In [58]:
# Convert an array of values into a dataset matrix.
# window is the number of previous time steps used as input to predict the next time step.
def create_LSTM_dataset(dataset, window=10):
    # dataset is an array
    dataX = [dataset[i:(i+window), :] for i in range(len(dataset)-window)]
    dataY = [dataset[j + window, 4] for j in range(len(dataset)-window)]
    return np.array(dataX), np.array(dataY)

# create_LSTM_dataset and create_LSTM_dataset2 produce the exact same results.

def create_LSTM_dataset2(dataset, window=10):
    # dataset is an array; window is the number of historical data points
    # the predictions are based on.
    dataX, dataY = [], []
    for i in range(len(dataset)-window):
        dataX.append(dataset[i:(i+window), :])
        dataY.append(dataset[i+window, 4])
    return np.array(dataX), np.array(dataY)


# (LSTMs apparently work best with time steps on the order of 200-400. I'll opt for 200.)
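The windowing in create_LSTM_dataset is easiest to verify on a toy array. Each dataX element is already a 2D (window, features) slice, so np.array() yields the 3D [samples, time steps, features] layout directly. A sketch with made-up data:

```python
import numpy as np

def create_LSTM_dataset(dataset, window=10):
    dataX = [dataset[i:(i+window), :] for i in range(len(dataset)-window)]
    dataY = [dataset[j + window, 4] for j in range(len(dataset)-window)]
    return np.array(dataX), np.array(dataY)

# Toy array: 15 "days" x 7 features, where column 4 ("Adj Close") of row r is 7*r + 4.
toy = np.arange(15 * 7, dtype=float).reshape(15, 7)
X, y = create_LSTM_dataset(toy, window=3)
print(X.shape)   # (12, 3, 7): 15 - 3 windows of 3 time steps x 7 features
print(y[0])      # 25.0, i.e. toy[3, 4], the "Adj Close" right after the first window
```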
In [59]:
# Check if the scaling went as expected. 
display(scaled_norm_stock_prices[10].head())
Open High Low Close Adj Close Volume Volatility
BIOT.ST__Date
2018-03-02 0.823461 0.818182 0.802834 0.799534 0.803186 0.005035 0.261649
2018-03-01 0.867596 0.862189 0.833530 0.825175 0.828360 0.014571 0.352919
2018-02-28 0.825784 0.883034 0.831169 0.874126 0.876419 0.027687 0.548972
2018-02-27 0.847851 0.844818 0.831169 0.827506 0.830648 0.011233 0.246076
2018-02-26 0.836237 0.847134 0.848878 0.846154 0.848956 0.008849 0.133475
In [60]:
display(norm_stock_prices[10].head())
Open High Low Close Adj Close Volume Volatility
BIOT.ST__Date
2018-03-02 9.8625 9.563637 9.5000 9.365854 11.622211 1.077271 1.176173
2018-03-01 10.3375 10.024242 9.8250 9.634146 11.955139 3.117635 1.586457
2018-02-28 9.8875 10.242424 9.8000 10.146341 12.590728 5.923744 2.467762
2018-02-27 10.1250 9.842424 9.8000 9.658536 11.985405 2.403273 1.106171
2018-02-26 10.0000 9.866667 9.9875 9.853659 12.227535 1.893361 0.600000

Create inputs for the LSTM model

Specify how many days our model will base its predictions on by changing the window parameter.

In [61]:
# The predictions are far more reliable when using the scaled input data rather than the
# unscaled data for the LSTM model (train_scaled and test_scaled are far better than train
# and test). The loss after 5 epochs is roughly 250 times lower with the scaled values.


window=1

# get ticker
tick = get_ticker(LSTM_train_list[0])  

"""Create the datasets"""
LSTM_train_input, LSTM_train_output = create_LSTM_dataset(LSTM_train_list[0].values, window)
LSTM_test_input, LSTM_test_output = create_LSTM_dataset(LSTM_test_list[0].values, window)

'''reshape the input to be [samples, time steps, features]'''
LSTM_test_input = np.reshape(LSTM_test_input, (LSTM_test_input.shape[0], LSTM_test_input.shape[1], 7))
LSTM_train_input = np.reshape(LSTM_train_input, (LSTM_train_input.shape[0], LSTM_train_input.shape[1], 7))


print(LSTM_train_input.shape)
print(LSTM_train_output.shape)
print('-----------')
print(LSTM_test_input.shape)
print(LSTM_test_output.shape)


# Check whether two arrays are equal
#print(np.array_equal(LSTM_train_input, testx))
#print(np.array_equal(LSTM_train_output, testy))
(1041, 1, 7)
(1041,)
-----------
(259, 1, 7)
(259,)
In [62]:
# Random seed for reproducibility
np.random.seed(2)

# Create the model
model = LSTM_model(LSTM_train_input, output_size = 1, neurons=20)
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
lstm_3 (LSTM)                (None, 20)                2240      
_________________________________________________________________
dropout_3 (Dropout)          (None, 20)                0         
_________________________________________________________________
dense_75 (Dense)             (None, 1)                 21        
_________________________________________________________________
activation_3 (Activation)    (None, 1)                 0         
=================================================================
Total params: 2,261
Trainable params: 2,261
Non-trainable params: 0
_________________________________________________________________
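A quick sanity check of the parameter count in the summary above, assuming the standard Keras LSTM parameterization (four gates, each with a weight matrix over the concatenated input and hidden state, plus a bias):

```python
def lstm_params(n_features, n_units):
    # 4 gates x ((input + recurrent) weights + bias per unit)
    return 4 * ((n_features + n_units) * n_units + n_units)

def dense_params(n_in, n_out):
    return n_in * n_out + n_out

lstm = lstm_params(7, 20)            # 2240, as reported for lstm_3
total = lstm + dense_params(20, 1)   # + 21 for dense_75 = 2261 in total
print(lstm, total)                   # 2240 2261
```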

Train the LSTM model

In [63]:
trained_LSTM = model.fit(LSTM_train_input, LSTM_train_output, epochs=20, 
                         batch_size=1, verbose=1, shuffle=True, validation_split=0.05)
Train on 988 samples, validate on 53 samples
Epoch 1/20
988/988 [==============================] - 10s 10ms/step - loss: 0.0123 - val_loss: 0.0036
Epoch 2/20
988/988 [==============================] - 8s 8ms/step - loss: 0.0046 - val_loss: 0.0020
Epoch 3/20
988/988 [==============================] - 8s 8ms/step - loss: 0.0034 - val_loss: 2.5097e-04
Epoch 4/20
988/988 [==============================] - 8s 8ms/step - loss: 0.0030 - val_loss: 0.0028
Epoch 5/20
988/988 [==============================] - 8s 8ms/step - loss: 0.0025 - val_loss: 0.0012
Epoch 6/20
988/988 [==============================] - 8s 8ms/step - loss: 0.0024 - val_loss: 0.0038
Epoch 7/20
988/988 [==============================] - 8s 8ms/step - loss: 0.0035 - val_loss: 0.0035
Epoch 8/20
988/988 [==============================] - 8s 8ms/step - loss: 0.0025 - val_loss: 0.0034
Epoch 9/20
988/988 [==============================] - 8s 8ms/step - loss: 0.0024 - val_loss: 0.0016
Epoch 10/20
988/988 [==============================] - 8s 8ms/step - loss: 0.0025 - val_loss: 0.0029
Epoch 11/20
988/988 [==============================] - 8s 8ms/step - loss: 0.0025 - val_loss: 0.0015
Epoch 12/20
988/988 [==============================] - 8s 8ms/step - loss: 0.0020 - val_loss: 0.0013
Epoch 13/20
988/988 [==============================] - 8s 8ms/step - loss: 0.0026 - val_loss: 0.0016
Epoch 14/20
988/988 [==============================] - 8s 8ms/step - loss: 0.0023 - val_loss: 0.0015
Epoch 15/20
988/988 [==============================] - 8s 8ms/step - loss: 0.0023 - val_loss: 0.0011
Epoch 16/20
988/988 [==============================] - 8s 8ms/step - loss: 0.0022 - val_loss: 0.0010
Epoch 17/20
988/988 [==============================] - 8s 8ms/step - loss: 0.0022 - val_loss: 9.3101e-04
Epoch 18/20
988/988 [==============================] - 7s 7ms/step - loss: 0.0022 - val_loss: 0.0014
Epoch 19/20
988/988 [==============================] - 8s 8ms/step - loss: 0.0022 - val_loss: 6.4651e-04
Epoch 20/20
988/988 [==============================] - 8s 8ms/step - loss: 0.0023 - val_loss: 9.2262e-04

Plot the Training error

In [64]:
plot_error(trained_LSTM)

trainScore = model.evaluate(LSTM_train_input, LSTM_train_output, verbose=0)
testScore = model.evaluate(LSTM_test_input, LSTM_test_output, verbose=0)
print("Mean Squared Error on the training data: {0:0.6f}".format(trainScore))
print("Mean Squared Error on the test data:     {0:0.6f}".format(testScore))
Mean Squared Error on the training data: 0.000263
Mean Squared Error on the test data:     0.000362

Performance on the training and test sets

In [65]:
# A function for plotting three dataframes and a zoomed in plot
def plot_LSTM(data_df1, data_df2, data_df3, window_length, title='', xlabel='', ylabel='', zoom=True):
    """data_df1 contains the train set, data_df2 contains the test set and data_df3 contains the entire dataset. 
       title is the plot title. If a zoomed in window is desired, set zoom to True"""
    line_w, line_zoom = 1.0, 1.5
    plt.figure(figsize=(10, 5))
    # Plot the predicted train and test data
    pl = data_df1.plot(color='orchid', fontsize=12, figsize=(16, 7), label=data_df1.columns[0], linewidth=line_w)
    diff = len(data_df3)-len(data_df2)-len(data_df1)
    pred = np.empty_like(data_df3)
    pred[:, :] = np.nan
    pred[len(data_df1)+diff:len(data_df3), :] = data_df2
    ###pred[len(data_df1)+window_length+1:len(data_df3), :] = data_df2
    plt.plot(pred, color='darkorange', label=data_df2.columns[0], linewidth=line_w)
    
    # Plot the actual values
    plt.plot(data_df3, color='green', label=data_df3.columns[0], linewidth=line_w) 
    
    pl.set_title(label=title, fontsize=20)
    pl.set_xlabel(xlabel, fontsize=15) 
    plt.autoscale(enable=True, axis='x', tight=True)    
    pl.set_ylabel(ylabel, fontsize=15)
    plt.legend(fontsize=12, loc='upper left')
    plt.grid(axis='both', alpha=.5)
    pl.xaxis.set_major_locator(MaxNLocator(12))
    pl.xaxis.set_major_formatter(IndexFormatter(data_df3.index[::-1]))   
    plt.xticks(rotation=50, horizontalalignment='center', rotation_mode='default')
    
    if zoom:
        ## The zoomed in window
        lg = int(len(data_df3)*0.1)
        axins = zoomed_inset_axes(pl, 2, loc=9)
        axins.plot(data_df1, color='orchid', linewidth=line_zoom)
        axins.plot(pred, color='darkorange', label=data_df2.columns[0], linewidth=line_zoom)
        axins.plot(data_df3.loc[::-1], color='green', linewidth=line_zoom)
        x1, x2 = data_df1.index[-lg//2], data_df2.index[lg]    # specify the limits
        
        # Check for the max and min y values in the actual values, within the x limits.
        yA1 = data_df3.loc[data_df3.index[test_size-lg]:data_df3.index[test_size+lg//2],'Actual Data'].min()
        yA2 = data_df3.loc[data_df3.index[test_size-lg]:data_df3.index[test_size+lg//2],'Actual Data'].max()
        
        # Check for the max and min y values in the train set, within the x limits.
        yTr1 = data_df1.iloc[-lg//2:, 0].min()
        yTr2 = data_df1.iloc[-lg//2:, 0].max()
        
        # Check for the max and min y values in the test set, within the x limits.
        yTe1 = data_df2.iloc[:lg, 0].min()
        yTe2 = data_df2.iloc[:lg, 0].max()
        
        ys = [yA1, yA2, yTr1, yTr2, yTe1, yTe2]
        ymax, ymin = max(ys), min(ys)                       # find the max and min values among the different y's
        axins.set_xlim(x1, x2), axins.set_ylim(ymin, ymax)         # apply the x-limits, apply the y-limits
        #plt.yticks(visible=False), plt.xticks(visible=False)      # Remove the tickers
        axins.set_facecolor('whitesmoke')
        axins.axis[:].set_visible(False)                           # Remove the 4 borders
        mark_inset(pl, axins, loc1=2, loc2=4, fc="none", ec="1.5") # Add some lines for the zoom effect
    plt.show()

Invert the scaling

In [66]:
# The prediction output always has 1 column (that was chosen when the LSTM was designed).

# Make predictions for the train set. Then invert the scaling.
LSTM_train_pred = model.predict(copy.deepcopy(LSTM_train_input))
LSTM_train_pred = Un_scale_data(copy.deepcopy(LSTM_train_pred), tick)
LSTM_train_output = Un_scale_data(copy.deepcopy(LSTM_train_output), tick)

# Make predictions for the test set. Then invert the scaling.
LSTM_test_pred = model.predict(copy.deepcopy(LSTM_test_input))
LSTM_test_pred = Un_scale_data(copy.deepcopy(LSTM_test_pred), tick)
LSTM_test_output = Un_scale_data(copy.deepcopy(LSTM_test_output), tick)
In [67]:
print(LSTM_train_pred.shape)
print(LSTM_train_output.shape)
print(LSTM_test_pred.shape)
print(LSTM_test_output.shape)
(1041, 1)
(1041, 1)
(259, 1)
(259, 1)
In [68]:
df1 = pd.DataFrame(data=LSTM_train_pred, index=LSTM_train_list[0].index[:-window], 
                   columns=['LSTM Predictions on Train set'])
df2 = pd.DataFrame(data=LSTM_test_pred, index=LSTM_test_list[0].index[:-window], 
                   columns=['LSTM Predictions on Test Set'])
df3 = pd.DataFrame(data=norm_stock_prices[1].loc[:, 'Adj Close'][:-window], index=glob_index[:-window])
df3.columns = ['Actual Data']

name = get_ticker(LSTM_train_list[0])
plot_LSTM(df1[::-1], df2[::-1], df3, window_length=window, 
          title= 'LSTM Single Day Performance on the Training and Test Sets, ' + name, 
          xlabel='Date', ylabel='Price', zoom=True)

Our LSTM seems to predict the changes in the stock price fairly well, even very well on the training set. That isn't surprising, though, since this is the data the model was trained on. More important is how it performs on the unseen test data (orange), where it still gives quite acceptable predictions. There are a few misses, but by and large the results are good.
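One way to put the test-set MSE in context is to compare it against a naive persistence baseline ("tomorrow's price equals today's"), which is notoriously hard to beat on daily prices. A sketch on synthetic data, since the notebook's actual arrays aren't reproduced here:

```python
import numpy as np

def persistence_mse(series):
    """MSE when each value is predicted by its predecessor."""
    preds, actual = series[:-1], series[1:]
    return float(np.mean((preds - actual) ** 2))

# Synthetic random-walk "prices" standing in for the scaled test series.
rng = np.random.RandomState(0)
walk = np.cumsum(rng.normal(0, 0.01, size=260))
print(persistence_mse(walk))   # the bar a model's test MSE should clear
```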

In [69]:
# def many_LSTM_models(nbr_of_plots=3)
def many_LSTM_models():
    global_time = time.time()
    nbr = 1
    window=1
    for i in range(1, len(LSTM_train_list)):
        print('===================')
        print('Plot: {0} (out of {1})'.format(nbr, len(LSTM_train_list)-1))
        print('===================')
        
        LSTM_train_input, LSTM_train_output = create_LSTM_dataset(LSTM_train_list[i].values, window)
        LSTM_test_input, LSTM_test_output = create_LSTM_dataset(LSTM_test_list[i].values, window)
        '''reshape the input to be [samples, time steps, features]'''
        LSTM_test_input = np.reshape(LSTM_test_input, (LSTM_test_input.shape[0], LSTM_test_input.shape[1], 7))
        LSTM_train_input = np.reshape(LSTM_train_input, (LSTM_train_input.shape[0], LSTM_train_input.shape[1], 7))
        
        # Random seed for reproducibility
        np.random.seed(2)
        model = LSTM_model(LSTM_train_input, output_size = 1, neurons=20)
        trained_LSTM = model.fit(LSTM_train_input, LSTM_train_output, epochs=20, 
                                 batch_size=1, verbose=1, shuffle=True, validation_split=0.05)
        
        trainScore = model.evaluate(LSTM_train_input, LSTM_train_output, verbose=0)
        testScore = model.evaluate(LSTM_test_input, LSTM_test_output, verbose=0)
        print("Mean Squared Error on the training data: {0:0.6f}".format(trainScore))
        print("Mean Squared Error on the test data:     {0:0.6f}".format(testScore))
        
        tick = get_ticker(LSTM_train_list[i])
        # Make predictions for the train set. Then invert the scaling.
        LSTM_train_pred = model.predict(copy.deepcopy(LSTM_train_input))
        LSTM_train_pred = Un_scale_data(copy.deepcopy(LSTM_train_pred), tick)
        LSTM_train_output = Un_scale_data(copy.deepcopy(LSTM_train_output), tick)

        # Make predictions for the test set. Then invert the scaling.
        LSTM_test_pred = model.predict(copy.deepcopy(LSTM_test_input))
        LSTM_test_pred = Un_scale_data(copy.deepcopy(LSTM_test_pred), tick)
        LSTM_test_output = Un_scale_data(copy.deepcopy(LSTM_test_output), tick)
        
        df1 = pd.DataFrame(data=LSTM_train_pred, index=LSTM_train_list[i].index[:-window], 
                           columns=['LSTM Predictions on Train set'])
        df2 = pd.DataFrame(data=LSTM_test_pred, index=LSTM_test_list[i].index[:-window], 
                           columns=['LSTM Predictions on Test Set'])
        df3 = pd.DataFrame(data=norm_stock_prices[1+i].loc[:, 'Adj Close'][:-window], index=glob_index[:-window])
        df3.columns = ['Actual Data']

        name = get_ticker(LSTM_train_list[i])
        plot_LSTM(df1[::-1], df2[::-1], df3, window_length=window, 
                  title= 'LSTM Single Day Performance on the Training and Test Sets, ' + name, 
                  xlabel='Date', ylabel='Price', zoom=True)
        nbr += 1
        print('======================================================================================================')
    print('======================================================================================================')
    print('Total run time in seconds: {0:0.0f}'.format(time.time()-global_time))
    
    
   
    
In [18]:
#many_LSTM_models()

Predicting 10 days ahead with the LSTM

In [71]:
from keras import regularizers

# Define the LSTM model
def LSTM10_model(inputs, output_size, neurons, activ_func="linear",
                dropout=0.2, loss="mean_squared_error", optimizer="rmsprop"):
    model = Sequential()
    
    model.add(LSTM(neurons, input_shape=(inputs.shape[1], inputs.shape[2]), return_sequences=True))
    model.add(Dropout(dropout))
    
    model.add(LSTM(neurons*2, return_sequences=False))
    model.add(Dropout(dropout))
    
    model.add(Dense(units=output_size))
    model.add(Activation(activ_func))
    
    model.compile(loss=loss, optimizer=optimizer)
    
    model.summary()
    return model
In [72]:
def create_LSTM10_dataset(dataset, window, pred_len=10):
    # dataset is an array. window is the number of historical datapoints the predictions 
    # are based on while pred_len is the prediction length.
    dataX, dataY = [], []
    for i in range(len(dataset)-window-pred_len+1):
        dataX.append(dataset[i:(i+window), :])
        dataY.append(dataset[(i + window):(i + window + pred_len), 4])
    return np.array(dataX), np.array(dataY)
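create_LSTM10_dataset returns a pred_len-wide target per sample, and the shapes are easy to confirm on a toy array. (Note that the cells below actually build their inputs with the single-target create_LSTM_dataset and roll predictions forward instead, which is why output_size=1 there.)

```python
import numpy as np

def create_LSTM10_dataset(dataset, window, pred_len=10):
    dataX, dataY = [], []
    for i in range(len(dataset) - window - pred_len + 1):
        dataX.append(dataset[i:(i + window), :])
        dataY.append(dataset[(i + window):(i + window + pred_len), 4])
    return np.array(dataX), np.array(dataY)

toy = np.arange(20 * 7, dtype=float).reshape(20, 7)   # 20 "days" x 7 features
X, y = create_LSTM10_dataset(toy, window=5, pred_len=3)
print(X.shape)   # (13, 5, 7): 20 - 5 - 3 + 1 samples
print(y.shape)   # (13, 3): three future "Adj Close" values per sample
```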
In [73]:
# Specify how many days ahead to predict by changing the into_the_future parameter.
into_the_future = 10

""""Define the input data for the 10 days LSTM prediction"""
window=10
LSTM10_train_input, LSTM10_train_output = create_LSTM_dataset(LSTM_train_list[0].values, window)
LSTM10_test_input, LSTM10_test_output = create_LSTM_dataset(LSTM_test_list[0].values, window)
In [74]:
'''reshape the input to be [samples, time steps, features]'''
LSTM10_test_input = np.reshape(LSTM10_test_input, (LSTM10_test_input.shape[0], LSTM10_test_input.shape[1], 7))
LSTM10_train_input = np.reshape(LSTM10_train_input, (LSTM10_train_input.shape[0], LSTM10_train_input.shape[1], 7))


print(LSTM10_train_input.shape)
print(LSTM10_test_input.shape)
print('--------------------')
print(LSTM10_train_output.shape)
print(LSTM10_test_output.shape)
(1032, 10, 7)
(250, 10, 7)
--------------------
(1032,)
(250,)
In [75]:
# Random seed for reproducibility
#np.random.seed(17)
np.random.seed(202)

model_10 = LSTM10_model(LSTM10_train_input, output_size = 1, neurons=50)
_________________________________________________________________
Layer (type)                 Output Shape              Param #   
=================================================================
lstm_75 (LSTM)               (None, 10, 50)            11600     
_________________________________________________________________
dropout_75 (Dropout)         (None, 10, 50)            0         
_________________________________________________________________
lstm_76 (LSTM)               (None, 100)               60400     
_________________________________________________________________
dropout_76 (Dropout)         (None, 100)               0         
_________________________________________________________________
dense_147 (Dense)            (None, 1)                 101       
_________________________________________________________________
activation_75 (Activation)   (None, 1)                 0         
=================================================================
Total params: 72,101
Trainable params: 72,101
Non-trainable params: 0
_________________________________________________________________
In [76]:
start = time.time()

trained_LSTM10 = model_10.fit(LSTM10_train_input, LSTM10_train_output, 
                              epochs=1, batch_size=2, verbose=1, shuffle=True, validation_split=0.05)

print('Total training time (s): {0:0.0f}'.format(time.time()-start))
Train on 980 samples, validate on 52 samples
Epoch 1/1
980/980 [==============================] - 35s 36ms/step - loss: 0.0031 - val_loss: 0.0024
Total training time (s): 37

Plot the Training error

In [77]:
#plot_error(trained_LSTM10)

trainScore = model_10.evaluate(LSTM10_train_input, LSTM10_train_output, verbose=0)
testScore = model_10.evaluate(LSTM10_test_input, LSTM10_test_output, verbose=0)
print("Mean Squared Error on the training data: {0:0.6f}".format(trainScore)) 
print("Mean Squared Error on the test data:     {0:0.6f}".format(testScore))
Mean Squared Error on the training data: 0.000703
Mean Squared Error on the test data:     0.002080
In [78]:
# The prediction output always has 1 column (that was chosen when the LSTM was designed).



# Make predictions for the train set. Then invert the scaling.
#LSTM10_train_pred = model_10.predict(LSTM10_train_input[:-into_the_future])
#print(LSTM10_train_pred.shape)
#LSTM10_train_pred = Un_scale_data(LSTM10_train_pred, tickr)
#LSTM10_train_output = Un_scale_data(LSTM10_train_output, tickr) ##

Performance on the Test Set

Let's focus on what's important and interesting, namely the performance on the test set.

In [79]:
### NOTE: These two functions still need to be adapted; they were copied over verbatim.

def plot_long_pred(pred_data, true_data, pred_len, title='', xlabel='', ylabel=''):
    """ Plot the predictions stored in pred_data and the true values stored in true_data.
        pred_len is the length of each prediction. """
    index = true_data.index
    fig = plt.figure(figsize=(16, 7), facecolor='white')
    ax = fig.add_subplot(111)
    ax.plot(true_data.values[:, 0][::-1], label='True Data')
    #Pad the list of predictions to shift it in the graph to its correct start
    for i, data in enumerate(pred_data):
        padding = [None for p in range(i * pred_len)]
        #Show the legend only for the first 5 predictions
        if i < 5:
            plt.plot((padding + data), label='Prediction')
            plt.legend()
        else:
            plt.plot(padding + data)
    ax.set_title(label=title, fontsize=20)
    ax.set_xlabel(xlabel, fontsize=15) 
    ax.autoscale(enable=True, axis='x', tight=True)    
    ax.set_ylabel(ylabel, fontsize=15)
    ax.grid(axis='both', alpha=.5)
    ax.xaxis.set_major_locator(MaxNLocator(12))
    ax.xaxis.set_major_formatter(IndexFormatter(index[::-1]))
    plt.setp(ax.get_xticklabels(), rotation=50, fontsize=12)
    plt.setp(ax.get_yticklabels(), fontsize=12)
    plt.show()


def predict_multiple_sequences(model, data, window_size, pred_len):
    """ Make a sequence of predictions of pred_len steps before shifting prediction run forward by pred_len steps."""
    prediction_seqs = []
    for i in range(int(len(data)/pred_len)):
        curr_frame = data[i*pred_len]
        predicted = []
        for j in range(pred_len):
            predicted.append(model.predict(curr_frame[np.newaxis,:,:])[0,0])
            curr_frame = curr_frame[1:]
            curr_frame = np.insert(curr_frame, [window_size-1], predicted[-1], axis=0)
        prediction_seqs.append(predicted)
    return prediction_seqs
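The rolling behaviour of predict_multiple_sequences can be exercised without Keras by substituting a hypothetical stub model (ConstantModel below is made up for illustration): each prediction is appended to the input frame, so later steps are conditioned on earlier predictions rather than on true data.

```python
import numpy as np

class ConstantModel:
    """Hypothetical stand-in for the Keras model: always predicts 0.5."""
    def predict(self, x):
        return np.full((x.shape[0], 1), 0.5)

def predict_multiple_sequences(model, data, window_size, pred_len):
    prediction_seqs = []
    for i in range(int(len(data) / pred_len)):
        curr_frame = data[i * pred_len]
        predicted = []
        for j in range(pred_len):
            predicted.append(model.predict(curr_frame[np.newaxis, :, :])[0, 0])
            curr_frame = curr_frame[1:]                          # drop the oldest step
            curr_frame = np.insert(curr_frame, [window_size - 1],
                                   predicted[-1], axis=0)        # append the prediction
        prediction_seqs.append(predicted)
    return prediction_seqs

data = np.zeros((20, 5, 7))      # 20 windows of 5 time steps x 7 features
seqs = predict_multiple_sequences(ConstantModel(), data, window_size=5, pred_len=10)
print(len(seqs), len(seqs[0]))   # 2 sequences of 10 predictions each
```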
In [80]:
tickr = get_ticker(LSTM_train_list[0])

LSTM10_predictions = predict_multiple_sequences(model_10, LSTM10_test_input[::-1], window, into_the_future)


#inv_LSTM10_predictions = copy.deepcopy(pd.DataFrame(LSTM10_predictions).transpose())
#for i in range(len(LSTM10_predictions)):
#    Un_scale_data(inv_LSTM10_predictions.iloc[:, i], tickr)
#    #column = Un_scale_data(copy.deepcopy(column.values), tickr)
##    inv_LSTM10_predictions.append(list1.tolist())
In [81]:
plot_long_pred(LSTM10_predictions, LSTM_test_list[0], into_the_future,
               title='10 day predictions on test set, ' + tickr, xlabel='Date', ylabel='Price')

Find the optimal algorithm for each stock

Run the cell below to find the optimal algorithm/tuning parameters for each stock. (The run takes several tens of hours.)

In [82]:
from keras import regularizers

# Define the LSTM model
def LSTM10_model2(inputs, output_size, neurons, activ_func="linear",
                dropout=0.2, loss="mean_squared_error", optimizer="rmsprop"):
    model = Sequential()

    model.add(LSTM(neurons, input_shape=(inputs.shape[1], inputs.shape[2]), return_sequences=True))
    model.add(Dropout(dropout))
    
    model.add(LSTM(neurons*2, return_sequences=False))
    model.add(Dropout(dropout))
    
    model.add(Dense(units=output_size))
    model.add(Activation(activ_func))
    
    model.compile(loss=loss, optimizer=optimizer)
    
    model.summary()
    return model


def many_LSTM10():
    global_start_time = time.time()
    windows = [10, 20]
    into_the_future = 10
    start = 0
    count = 1
    
    for stock_nbr in range(1):
        for window in windows:
            for batch_size in [1, 2, 10, 50, 100]:
                for epoch in [1, 2]:
                    """"Define the input data for the 10 day LSTM prediction"""
                    LSTM10_train_input, LSTM10_train_output = create_LSTM_dataset(LSTM_train_list[stock_nbr].values, window)
                    LSTM10_test_input, LSTM10_test_output = create_LSTM_dataset(LSTM_test_list[stock_nbr].values, window)
                
                    '''reshape the input to be [samples, time steps, features]'''
                    LSTM10_test_input = np.reshape(LSTM10_test_input, (LSTM10_test_input.shape[0], 
                                                                       LSTM10_test_input.shape[1], 7))
                    LSTM10_train_input = np.reshape(LSTM10_train_input, (LSTM10_train_input.shape[0], 
                                                                         LSTM10_train_input.shape[1], 7))
                
                    print('===============================')
                    print('(Stock, window, batch_size, epoch)')
                    print('{0}, {1}, {2}, {3}'.format(get_ticker(LSTM_train_list[stock_nbr]), 
                                                      window, batch_size, epoch))
                    print('Run: {0} ({1})'.format(count, 1*2*len([1, 2, 10, 50, 100])*2))
                    print('===============================')
                
                    # Random seed for reproducibility
                    np.random.seed(202)
                    model_10 = LSTM10_model2(LSTM10_train_input, output_size = 1, neurons=50)
                    trained_LSTM10 = model_10.fit(LSTM10_train_input, LSTM10_train_output, epochs=epoch, 
                                                  batch_size=batch_size, verbose=1, 
                                                  shuffle=True, validation_split=0.05)
                
                    #plot_error(trained_LSTM10)
                    trainScore = model_10.evaluate(LSTM10_train_input, LSTM10_train_output, verbose=0)
                    testScore = model_10.evaluate(LSTM10_test_input, LSTM10_test_output, verbose=0)
                    print("Mean Squared Error on the training data: {0:0.6f}".format(trainScore)) 
                    print("Mean Squared Error on the test data:     {0:0.6f}".format(testScore))
                
                    tickr = get_ticker(LSTM_train_list[stock_nbr])
                    LSTM10_predictions = predict_multiple_sequences(model_10, LSTM10_test_input[::-1], 
                                                                    window, into_the_future)
                
                    plot_long_pred(LSTM10_predictions, LSTM_test_list[stock_nbr], into_the_future,
                           title='10 day predictions on test set, ' + tickr, xlabel='Date', ylabel='Price')
                
                    count += 1
                    print('==================================================================')
    print('==================================================================')
    print('Total training time (s): {0:0.0f}'.format(time.time()-global_start_time))
    
    
In [83]:
#many_LSTM10()
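The nested loops in many_LSTM10() amount to a grid search; itertools.product gives the same enumeration and makes the run count explicit (the grid values below mirror the function above):

```python
from itertools import product

stocks = range(1)                  # a single stock
windows = [10, 20]
batch_sizes = [1, 2, 10, 50, 100]
epochs = [1, 2]

grid = list(product(stocks, windows, batch_sizes, epochs))
print(len(grid))   # 20 runs, matching the 1*2*5*2 in the progress printout
```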
In [84]:
###### from keras import regularizers

# Define the LSTM model
#def LSTM10_model2(inputs, output_size, neurons, activ_func="linear",
#                dropout=0.2, loss="mean_squared_error", optimizer="rmsprop"):
#    model = Sequential()

#    model.add(LSTM(neurons, input_shape=(inputs.shape[1], inputs.shape[2]), return_sequences=True))
#    model.add(Dropout(dropout))
    
#    model.add(LSTM(neurons*2, return_sequences=False))
#    model.add(Dropout(dropout))
    
#    model.add(Dense(units=output_size))
#    model.add(Activation(activ_func))
    
#    model.compile(loss=loss, optimizer=optimizer)
    
#    model.summary()
#    return model


#def many_LSTM10():
#    global_start_time = time.time()
#    windows = [10, 20]
#    into_the_future = 10
#    start = 0
#    count = 1
#    stocks_to_change = [5, 12, 30, 49, 51, 61, 54]
    
#    for stock_nbr in stocks_to_change:
#        for window in windows:
#            for batch_size in [1, 150, 200]:
#                for epoch in [5, 10]:
#                    """Define the input data for the 10-day LSTM prediction"""
#                    LSTM10_train_input, LSTM10_train_output = create_LSTM_dataset(LSTM_train_list[stock_nbr].values, window)
#                    LSTM10_test_input, LSTM10_test_output = create_LSTM_dataset(LSTM_test_list[stock_nbr].values, window)
                
#                    '''reshape the input to be [samples, time steps, features]'''
#                    LSTM10_test_input = np.reshape(LSTM10_test_input, (LSTM10_test_input.shape[0], LSTM10_test_input.shape[1], 7))
#                    LSTM10_train_input = np.reshape(LSTM10_train_input, (LSTM10_train_input.shape[0], LSTM10_train_input.shape[1], 7))
                
#                    print('===============================')
#                    print('(Stock, window, batch_size, epoch)')
#                    print('{0}, {1}, {2}, {3}'.format(get_ticker(LSTM_train_list[stock_nbr]), window, batch_size, epoch))
#                    print('Run: {0} ({1})'.format(count, len(stocks_to_change)*2*len([1, 150, 200])*2))
#                    print('===============================')
                
#                    # Random seed for reproducibility
#                    np.random.seed(202)
#                    model_10 = LSTM10_model2(LSTM10_train_input, output_size = 1, neurons=50)
#                    trained_LSTM10 = model_10.fit(LSTM10_train_input, LSTM10_train_output, epochs=epoch, 
#                                                   batch_size=batch_size, verbose=1, shuffle=True, validation_split=0.05)
                
#                    #plot_error(trained_LSTM10)
#                    trainScore = model_10.evaluate(LSTM10_train_input, LSTM10_train_output, verbose=0)
#                    testScore = model_10.evaluate(LSTM10_test_input, LSTM10_test_output, verbose=0)
#                    print("Mean Squared Error on the training data: {0:0.6f}".format(trainScore)) 
#                    print("Mean Squared Error on the test data:     {0:0.6f}".format(testScore))
                
#                    tickr = get_ticker(LSTM_train_list[stock_nbr])
#                    LSTM10_predictions = predict_multiple_sequences(model_10, LSTM10_test_input[::-1], window, into_the_future)
                
#                    plot_long_pred(LSTM10_predictions, LSTM_test_list[stock_nbr], into_the_future,
#                           title='10 day predictions on test set, ' + tickr, xlabel='Date', ylabel='Price')
                
#                    count += 1
#                    print('==================================================================')
#    print('==================================================================')
#    print('Total training time (s): {0:0.0f}'.format(time.time()-global_start_time))
    

Optimal tuning parameters

The optimal tuning parameters are stored in the algorithm_tunings dictionary, which maps each stock's ticker to [window size, batch size, number of epochs].

In [86]:
#(Stock: Window size, batch size, number of epochs)

algorithm_tunings = {'ACAN-B.ST': [10, 1, 1], 'ANOD-B.ST': [10, 50, 1], 'ADDT-B.ST':[10, 2, 1],'AOI.ST':[10, 50, 1],
                    'AQ.ST':[10, 50, 1], 'ARCM.ST':[20, 100, 1], 'BEIA-B.ST': [20, 50, 1], 'BEIJ-B.ST': [10, 50, 1],
                    'BIOG-B.ST':[20, 10, 1], 'BIOT.ST':[10, 50, 1], 'PXXS-SDB.ST':[20, 2, 1], 'BULTEN.ST':[20, 50, 1],
                    'BURE.ST':[20, 50, 2], 'BMAX.ST':[10, 100, 2], 'CAT-A.ST':[10, 50, 1], 'CAT-B.ST': [10, 100, 2],
                    'CATE.ST': [10, 100, 2], 'CCC.ST': [10, 10, 1], 'CEVI.ST': [10, 50, 1], 'CLAS-B.ST': [20, 50, 2],
                    'CLA-B.ST': [10, 100, 2], 'COIC.ST': [20, 50, 1], 'CRED-A.ST':[20, 100, 1], 'DIOS.ST':[20, 50, 1],
                    'DUNI.ST':[10, 100, 2], 'ELAN-B.ST':[20, 100, 1], 'ENQ.ST': [10, 10, 1], 'FAG.ST':[20, 100, 1],
                    'FPAR.ST':[20, 50, 1], 'G5EN.ST':[20, 2, 1], 'GUNN.ST':[20, 50, 1], 'HLDX.ST':[10, 10, 1],
                    'HMED.ST':[20, 100, 1], 'HEBA-B.ST':[20, 100, 2], 'HIQ.ST':[20, 100, 2], 'HMS.ST':[10, 50, 1],
                    'IAR-B.ST':[10, 50, 1], 'IVSO.ST':[10, 50, 2], 'KABE-B.ST':[10, 50, 1], 'KAHL.ST':[10, 100, 2],
                    'KARO.ST':[20, 50, 1], 'KNOW.ST':[10, 100, 1], 'LIAB.ST':[10, 100, 2], 'LUC.ST':[20, 10, 1],
                    'MVIR-B.ST':[10, 2, 2], 'MEKO.ST':[10, 100, 2], 'MSON-A.ST':[10, 100, 1],'MSON-B.ST':[10, 100, 2],
                    'MYCR.ST':[10, 100, 2], 'NMAN.ST':[10, 100, 1], 'NETI-B.ST':[20, 50, 2], 'NEWA-B.ST':[10, 100, 2],
                    'NOLA-B.ST':[10, 100, 1], 'OEM-B.ST':[10, 50, 1], 'OPUS.ST':[10, 100, 2], 'ORX.ST':[10, 1, 1],
                    'PROB.ST':[10, 10, 1], 'QLRO.ST':[20, 2, 2], 'RAY-B.ST':[10, 100, 2], 'REZT.ST':[10, 10, 2],
                    'SAS.ST':[10, 100, 2], 'SMF.ST':[10, 1, 2], 'SKIS-B.ST':[10, 50, 1], 'STAR-B.ST':[10, 2, 1],
                    'SWOL-B.ST':[20, 100, 1], 'SYSR.ST':[20, 50, 1], 'TETY.ST':[10, 100, 1], 'TRAC-B.ST':[20, 100, 2],
                    'VBG-B.ST':[20, 2, 1], 'VITR.ST':[10, 50, 1], 'XVIVO.ST':[20, 50, 2], 'ORES.ST':[20, 50, 2]}
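Reading a stock's parameters back out is a plain dictionary lookup followed by unpacking. A minimal illustration of that pattern (the one-entry dictionary below is a stand-in for the full algorithm_tunings):

```python
# One-entry stand-in for the full algorithm_tunings dictionary,
# mapping ticker -> [window size, batch size, number of epochs].
tunings_demo = {'ACAN-B.ST': [10, 1, 1]}

# Unpack the three tuning parameters for one stock.
window, batch_size, epochs = tunings_demo['ACAN-B.ST']
print(window, batch_size, epochs)  # -> 10 1 1
```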
In [87]:
def create_pred_path(ticker):
    """Create a file path to store file(s)"""
    base = '/Users/jakob/Desktop/Programming/Udacity Machine Learning Nano Degree/Capstone Project/Predictions/'
    return(base + ticker + '_Predictions' + '.csv') 
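The hard-coded absolute path above only works on the original machine. A hedged, portable sketch of the same idea using pathlib, building the path relative to the working directory (the 'Predictions' directory name is an assumption carried over from the original):

```python
from pathlib import Path

def create_pred_path_portable(ticker, base_dir='Predictions'):
    """Portable sketch of create_pred_path: builds the file path relative
    to the working directory instead of a machine-specific absolute path.
    The 'Predictions' base directory name is an assumption."""
    return str(Path(base_dir) / (ticker + '_Predictions.csv'))

print(create_pred_path_portable('ACAN-B.ST'))
```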

Make predictions for each stock

Make predictions for all the stocks, using each stock's individually tuned model. Save the predictions both to a dictionary and to a .csv file (one per stock).

In [88]:
def make_all_10pred():
    all_predictions = {}
    global_start_time = time.time()
    into_the_future = 10
    count = 1
    
    print('Creating predictions... ')
    print()
    
    for stock_nbr in range(len(LSTM_train_list)):
        
        stock_ticker = get_ticker(LSTM_train_list[stock_nbr])
        stock_info = algorithm_tunings[stock_ticker]
        window, batch_size, epoch = stock_info[0], stock_info[1], stock_info[2]
        
        print('===============================')
        print('(Stock, window, batch_size, epoch)')
        print('{0}, {1}, {2}, {3}'.format(stock_ticker, window, batch_size, epoch))
        print('Count: {0} ({1})'.format(count, len(LSTM_train_list)))
        print('===============================')
        
        
        """"Define the input data for the 10 day LSTM prediction"""
        LSTM10_train_input, LSTM10_train_output = create_LSTM_dataset(LSTM_train_list[stock_nbr].values, window)
        LSTM10_test_input, LSTM10_test_output = create_LSTM_dataset(LSTM_test_list[stock_nbr].values, window)
        
        # Reshape the input to be [samples, time steps, features]
        LSTM10_test_input = np.reshape(LSTM10_test_input, (LSTM10_test_input.shape[0], LSTM10_test_input.shape[1], 7))
        LSTM10_train_input = np.reshape(LSTM10_train_input, (LSTM10_train_input.shape[0], LSTM10_train_input.shape[1], 7))
        
        # Random seed for reproducibility
        np.random.seed(202)
        model_10 = LSTM10_model(LSTM10_train_input, output_size = 1, neurons=50)
        trained_LSTM10 = model_10.fit(LSTM10_train_input, LSTM10_train_output, epochs=epoch, 
                                      batch_size=batch_size, verbose=1, shuffle=True, validation_split=0.05)
                
        #plot_error(trained_LSTM10)
        trainScore = model_10.evaluate(LSTM10_train_input, LSTM10_train_output, verbose=0)
        testScore = model_10.evaluate(LSTM10_test_input, LSTM10_test_output, verbose=0)
        print("Mean Squared Error on the training data: {0:0.6f}".format(trainScore)) 
        print("Mean Squared Error on the test data:     {0:0.6f}".format(testScore))        
        
        #tickr = get_ticker(LSTM_train_list[stock_nbr])
        LSTM10_predictions = predict_multiple_sequences(model_10, LSTM10_test_input[::-1], window, into_the_future)
        
        # Save the predictions to a .csv file
        pd.DataFrame(LSTM10_predictions).to_csv(create_pred_path(stock_ticker))
        
        # Save the predictions to an array
        all_predictions[stock_ticker] = LSTM10_predictions 
        
        count += 1
        print('======================================================================================')    
        print('======================================================================================') 
        print()
    print('...Done!')
    print('Total run time (s): {0:0.0f}'.format(time.time()-global_start_time)) 
    return all_predictions
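The reshape step inside make_all_10pred turns the window matrix into the 3-D [samples, time steps, features] tensor that Keras LSTM layers expect. A standalone sketch with synthetic shapes (the 7 features correspond to the seven price/volume columns used throughout; the data here is random, for illustration only):

```python
import numpy as np

# Synthetic stand-in: 5 samples, each a flattened window of
# 10 time steps x 7 features (the seven price/volume columns).
samples, window, features = 5, 10, 7
flat = np.random.rand(samples, window * features)

# Reshape to the [samples, time steps, features] layout Keras LSTMs expect.
lstm_input = np.reshape(flat, (samples, window, features))
print(lstm_input.shape)  # -> (5, 10, 7)
```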
    
    
In [89]:
# predictions_10 contains all the predictions for each stock. (The run will take roughly an hour.)
#predictions_10 = make_all_10pred()
In [90]:
#print(pd.DataFrame(predictions_10['ACAN-B.ST']))

Read the predictions from the .csv files

In [91]:
def get_predictions():
    di = '/Users/jakob/Desktop/Programming/Udacity Machine Learning Nano Degree/Capstone Project/Predictions/'
    filePaths = glob(di+"*.csv")  # Get each .csv file in the directory

    # Get all the file names
    file_names_pred = []
    for root, dirs, files in os.walk(di):
        for filename in files:
            filename = filename[:-4]   # Just keep the ticker name, without the .csv file extension
            file_names_pred.append(filename)
    del file_names_pred[0]  # Drop the first collected name (assumed to be a hidden file such as .DS_Store)

    # Get the predictions from the .csv files
    predictions = []
    for i in range(len(filePaths)):
        predi = pd.read_csv(filePaths[i])
        predi.drop(predi.columns[[0]], axis=1, inplace=True)
        predi.index.names = [file_names_pred[i]]
        predictions.append(predi)

    return predictions
        
predictions_10 = get_predictions()
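The save/load pair above (to_csv in make_all_10pred, read_csv plus dropping the stored index column in get_predictions) can be checked in isolation. A small round-trip sketch using synthetic predictions and a temporary directory (the ticker name is made up):

```python
import os
import tempfile

import numpy as np
import pandas as pd

# Synthetic 10-step prediction sequences standing in for real LSTM output.
preds = np.arange(30, dtype=float).reshape(3, 10)

with tempfile.TemporaryDirectory() as tmp:
    path = os.path.join(tmp, 'DEMO.ST_Predictions.csv')
    pd.DataFrame(preds).to_csv(path)                        # save, as in make_all_10pred
    loaded = pd.read_csv(path)                              # load, as in get_predictions
    loaded.drop(loaded.columns[[0]], axis=1, inplace=True)  # drop the stored index column

print(np.allclose(loaded.values, preds))  # -> True
```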
In [92]:
def calc_accuracy(predictions, true_data):
    """predictions is a 2D DataFrame of predicted values;
    true_data is a DataFrame with the true prices."""
    correct, not_correct = 0, 0
    rows, columns = predictions.shape[0], predictions.shape[1]
    for row in range(rows):
        if (predictions.iloc[row,0] < predictions.iloc[row, -1]) and (true_data.iloc[row*columns-columns, 4] < true_data.iloc[row*columns, 4]):  
            correct += 1
        elif (predictions.iloc[row,0] > predictions.iloc[row, -1]) and (true_data.iloc[row*columns-columns, 4] > true_data.iloc[row*columns, 4]):
            correct += 1
        else:
            not_correct += 1
    return (correct / rows)
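calc_accuracy scores direction only: a prediction window counts as correct when the predicted move (last predicted value vs. the first) has the same sign as the actual price move over that window. A self-contained sketch of that idea on synthetic sequences (a simplified helper, not the notebook's calc_accuracy, which indexes into the full price DataFrame):

```python
import numpy as np

def directional_accuracy(pred_windows, true_windows):
    """Fraction of windows where the predicted move (last - first) has the
    same sign as the true move. Illustrative only; the notebook's
    calc_accuracy works against the full price DataFrame layout."""
    pred_dir = np.sign(pred_windows[:, -1] - pred_windows[:, 0])
    true_dir = np.sign(true_windows[:, -1] - true_windows[:, 0])
    return float(np.mean(pred_dir == true_dir))

# Predicted directions: up, down, up; true directions: up, up, up.
pred = np.array([[1.0, 2.0], [5.0, 4.0], [3.0, 3.5]])
true = np.array([[10.0, 12.0], [8.0, 9.0], [6.0, 7.0]])
print(directional_accuracy(pred, true))  # 2 of 3 directions match
```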


def get_top_predictions(top_x, pred_list, true_data_list):
    """top_x is the number of top predictions to return.
        pred_list is a list containing all the predictions for all the stocks.
        true_data_list is a list containing all the true data, stored in DataFrames."""
    cor_inc_pred = {}
    for pred in pred_list:
        row, col = pred.shape[0], pred.shape[1]
        tkr = get_ticker(pred)[:-6]
        true_data = get_stock(true_data_list, tkr)[::-1]
        if (pred.iloc[-1, -1] > pred.iloc[-1, 0]) and (true_data.iloc[row*col, 4] > true_data.iloc[row*col-col, 4]):
            cor_inc_pred[tkr] = ((pred.iloc[-1, -1] - pred.iloc[-1, 0]) / pred.iloc[-1, 0])
            
    sorted_cor_inc_pred = sorted(cor_inc_pred.items(), key=operator.itemgetter(1), reverse=True)
    if len(sorted_cor_inc_pred) < top_x:
        print('Fewer correct top predictions were found than asked for. Returning all that were found.\n')
    return sorted_cor_inc_pred[:top_x]
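The ranking step in get_top_predictions is the standard sort-a-dict-by-value pattern with operator.itemgetter. In isolation, with a hypothetical ticker-to-gain mapping:

```python
import operator

# Hypothetical ticker -> predicted relative gain mapping.
gains = {'AAA.ST': 0.05, 'BBB.ST': 0.16, 'CCC.ST': 0.09}

# Sort the (ticker, gain) pairs by gain, largest first, and keep the top 2.
top2 = sorted(gains.items(), key=operator.itemgetter(1), reverse=True)[:2]
print(top2)  # -> [('BBB.ST', 0.16), ('CCC.ST', 0.09)]
```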
    


def plot_all_10pred(predictions):
    """Plot all the predicted values"""
    into_the_future, good_acc = 10, 0
    for predi in predictions:
        tkr = get_ticker(predi)[:-6]
        true_values = get_stock(LSTM_test_list, tkr)
        plot_long_pred(predi.values.tolist(), true_values, into_the_future,
                       title='10 day predictions on test set, ' + tkr, xlabel='Date', ylabel='Price')
        accuracy = calc_accuracy(predi, true_values)
        if accuracy >= 0.6:
            good_acc += 1
            
        print('Accuracy score for {0}: {1:0.2f}%'.format(tkr, accuracy*100))
        print('==================================================================================================') 
    print('==================================================================================================')
    print()
    print('Total number of satisfactory predictions (over 60% accuracy): {0} out of {1}'.format(good_acc, len(predictions)))
    
In [19]:
#plot_all_10pred(predictions_10)
In [94]:
top_pred = get_top_predictions(5, predictions_10, LSTM_test_list)
print(top_pred)
[('REZT.ST', 0.16184365216011876), ('TETY.ST', 0.14915278453385278), ('BMAX.ST', 0.13797540005218606), ('CAT-A.ST', 0.099748663312861585), ('CEVI.ST', 0.064334982968075119)]